Partial Sum Query

JaedenRuiner

Well-known member
Joined
Aug 13, 2007
Messages
340
Programming Experience
10+
Alrighty,

You all have been awesome and have gotten me this far. I have several ways to do this next part (one of which I'll be coding now), but I thought I'd ask in case there's a better way to do it.

I have four tables:
Customer
BOL (bill of lading)
Email
Material

Customer's PK is a FK in both BOL and Email.

Each BOL has up to 4 unique Materials, and their quantities.

The IDEA:

After inserting all the BOLs from the XLS file into the BOL table, I need to break them down by Customer ID, splitting the individual BOL IDs into the Email table via a partial sum of BOL materials.

Confused yet? Okay, here's the example:

Customer
  1. ID=100
    BOL
    • ID=27364
      • MatID=1
      • Qty=125
      • MatID=2
      • Qty=1000
    • ID=27365
      • MatID=1
      • Qty=200
      • MatID=2
      • Qty=650
    • ID=27366
      • MatID=1
      • Qty=40
      • MatID=2
      • Qty=22
    • ID=27367
      • MatID=1
      • Qty=75

My target is 350. Customer 100 has 4 BOLs, and the running sum of Material ID 1 breaks 350 at the 3rd entry (125 + 200 + 40 = 365). So I'm looking to have a marker: when the running sum of MatID 1's Qty > 350, mark all those BOLs as children of an entry in the Email table, then create a new Email record for that customer for the next set of BOLs.
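The "retain the sum myself" approach can be sketched in a few lines. This is a minimal illustration of the logic described above, not the actual code; the names and data shapes are made up, and the data is Customer 100 from the example:

```python
def split_bols_into_emails(bols, target=350, trigger_mat_id=1):
    """bols: list of (bol_id, lines) for ONE customer, ordered by BOL ID,
    where lines is a list of (mat_id, qty). Returns {email_group: [bol_ids]}.
    The BOL that crosses the target stays in the closing group, then the
    running total resets for the next email."""
    groups = {}
    group_no = 1
    running = 0
    for bol_id, lines in bols:
        groups.setdefault(group_no, []).append(bol_id)
        running += sum(qty for mat_id, qty in lines if mat_id == trigger_mat_id)
        if running > target:      # this BOL broke the 350 target:
            group_no += 1         # close this email, start the next one
            running = 0
    return groups

# Customer 100 from the example above
bols = [
    (27364, [(1, 125), (2, 1000)]),
    (27365, [(1, 200), (2, 650)]),
    (27366, [(1, 40),  (2, 22)]),
    (27367, [(1, 75)]),
]
```

Running this puts BOLs 27364-27366 in the first email group (125 + 200 + 40 = 365 > 350) and 27367 in the second, matching the walkthrough above.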

I can do this programmatically, retaining the sum myself. The insertion into the Email table is handled by a stored procedure, since the Email_ID PK is a formatted field based on the Customer ID plus an incrementing number per customer. But I was curious whether this could be done in SQL, with the server doing some of the work for me. *chuckle*

thanks
 
What about MatID 2?

Unnecessary, really. I'm importing and processing bills of lading, so they have multiple materials per shipment. We could just as easily count MatID 2 to 500, or MatID 3 to 1500, for the same effect, but MatID 1 is the one my boss gave me as the trigger element.

I'm long past this now, and found that a premade query (a view, I think SQL Server calls them) and a filtered scalar function work beautifully for what I need.

Thanks, though.
 
I'd have used an analytic function to sum the result of a conditional (CASE WHEN matid = 1 THEN qty ELSE 0 END) and divide it by 350; truncating the decimal gives a new ID for every 350-unit increment. Partition by the customer number and you're pretty much there. Other analytics could be used to perform rolling sums and increment a counter whenever a div-by-350 difference between the current rolling sum and the previous one is detected.
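A runnable sketch of that analytic idea, using Python's `sqlite3` since SQLite supports the same `SUM() OVER` window function as SQL Server 2012+. Table and column names are illustrative, not the original schema; the data is Customer 100 from the example:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE bol_line (cust_id INT, bol_id INT, mat_id INT, qty INT);
    INSERT INTO bol_line VALUES
        (100, 27364, 1, 125), (100, 27364, 2, 1000),
        (100, 27365, 1, 200), (100, 27365, 2, 650),
        (100, 27366, 1, 40),  (100, 27366, 2, 22),
        (100, 27367, 1, 75);
""")

# Cumulative sum of the trigger material per customer, integer-divided by
# 350 so every 350-unit increment yields a new group number.
rows = con.execute("""
    SELECT bol_id,
           SUM(trigger_qty) OVER (PARTITION BY cust_id ORDER BY bol_id)
               / 350 AS email_group
    FROM (SELECT cust_id, bol_id,
                 SUM(CASE WHEN mat_id = 1 THEN qty ELSE 0 END) AS trigger_qty
          FROM bol_line
          GROUP BY cust_id, bol_id) AS per_bol
    ORDER BY bol_id
""").fetchall()
```

Note one boundary difference from the original post's walkthrough: plain div-by-350 bucketing puts the BOL that crosses the target (27366, cumulative 365) into the *next* group, whereas the post keeps the crossing BOL in the closing email. Getting that exact behavior is where the rolling-sum-plus-counter variant mentioned above comes in.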
 

I'm not sure that would work, or at least I'd have to rework all the processing. It's seriously a 16-20 stage process of importing from Excel, cleaning the table, exporting errors, etc., all done through SQL commands. This part was the final stage, so I simply run a cyclical UPDATE command for each BOL ID per CustID (which usually has 3-4 Mat IDs), and on each pass do an ExecuteScalar on the filtered view. That lets me determine when the quantity >= 350, flag each BOL ID with its correlating parent EmailID, and, once I'm over 350, execute the stored procedure on the Email table to get the next EmailID for the continuing sequence.
The second phase (the user has two phases to execute) processes the EmailID entries that are over 350 and have not been emailed, emailing them to the clients, generating attachments, and such.
I did find, however, that using the premade view (with a SUM() column and a GROUP BY clause) was faster than running a SUM() via ExecuteScalar on the fly. *shrug*
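For anyone reading along, here's a minimal sketch of that view-plus-scalar-read pattern, again via `sqlite3` (the schema and names are illustrative, and the single-value `fetchone()` read stands in for ADO.NET's `ExecuteScalar`):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE bol_line (cust_id INT, bol_id INT, mat_id INT, qty INT);
    INSERT INTO bol_line VALUES
        (100, 27364, 1, 125), (100, 27365, 1, 200),
        (100, 27366, 1, 40),  (100, 27367, 1, 75);

    -- Premade view: one row per customer with the summed trigger-material
    -- (MatID 1) quantity, so the per-pass read is a simple filtered lookup.
    CREATE VIEW v_trigger_totals AS
        SELECT cust_id, SUM(qty) AS total_qty
        FROM bol_line
        WHERE mat_id = 1
        GROUP BY cust_id;
""")

# ExecuteScalar-style read: filter the view, fetch the single value.
total = con.execute(
    "SELECT total_qty FROM v_trigger_totals WHERE cust_id = ?", (100,)
).fetchone()[0]
```

The plausible reason the view wins over an ad-hoc SUM() per pass is that the aggregation is planned once and the per-pass query reduces to a filtered lookup, though that's speculation on my part.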
 