I will use a very simple model that lends itself nicely to the SQL environment, written up by CERN researchers (hard to beat nuclear scientists: http://ais.web.cern.ch/AIS/apps/lims/theory.html). Rather than grouping data into monthly buckets and extrapolating with models like exponential smoothing, this approach offers a "mathematical solution based on the real statistical distributions."
Inventory management algorithms use some variation of the order point model: when inventory for a given stock keeping unit (SKU) falls below an order point, the system recommends purchasing or manufacturing more (with the quantity usually based on the economic order quantity, a topic for another day). The order point is calculated as the quantity demanded over the lead time plus safety stock. Lead time is the time between noticing that more is required and having more available in inventory for immediate sale; it includes inventory review, purchase order placement, vendor fulfillment, transit, receiving, and put-away, and is usually measured in days or weeks. Lead time quantity is the quantity expected to be sold over the lead time. Safety stock is the inventory held against unpredictable future demand fluctuation. Typically it is set to achieve some fill level consistent with the business's value proposition, say 95% of customer orders filled from stock. Assuming demand is normally distributed, safety stock is the number of standard deviations associated with the service level times the standard deviation of demand over the lead time.
Modeling requires six elements:
- orders per day (opd) = count of orders for some period/days in period
- average order quantity (aoq) = sum of quantity per order/count of orders
- order quantity variance (oqv) = sum of squared difference between the order quantity and average order quantity / count of orders minus 1
- average lead time in days (altd)
- lead time in days variance (ltdv)
- number of standard deviations associated with service level (z)
From these six elements, three derived quantities follow:
- lead time quantity (ltq) = opd*altd*aoq
- lead time variance (ltv) = opd*altd*(oqv + aoq^2) + opd^2*aoq^2*ltdv
- order point (op) = ltq + z*ltv^.5
As an aside, many vendors quote a fixed lead time, which can be treated as altd with zero variance, simplifying the lead time variance above to ltv = opd*altd*(oqv + aoq^2).
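To make the arithmetic concrete, here is a small Python sketch of the three derived quantities. The variable names follow the article's abbreviations; the input values are made up purely for illustration.

```python
import math

def order_point(opd, aoq, oqv, altd, ltdv, z):
    """Return (ltq, ltv, op) for the compound-demand order point model."""
    ltq = opd * altd * aoq                                        # expected demand over lead time
    ltv = opd * altd * (oqv + aoq**2) + opd**2 * aoq**2 * ltdv    # variance of that demand
    op = ltq + z * math.sqrt(ltv)                                 # order point = mean + z * std dev
    return ltq, ltv, op

# Example with made-up inputs: 2 orders/day, average order quantity 5
# (variance 4), 10-day lead time (variance 9), 95% service (z ~ 1.645).
ltq, ltv, op = order_point(opd=2, aoq=5, oqv=4, altd=10, ltdv=9, z=1.645)
```

Note how the lead time variance has two parts: demand-quantity uncertainty (the first term) and lead-time-duration uncertainty (the second), so it grows even when each individual order is perfectly predictable.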
The program employs three tables, stripped to the essential information for the matters at hand. The first is #item, which holds the item list and relevant SKU-specific planning parameters: days in stock (useful for new items or for adjusting the forecast for stock-outs), targeted service level (which may vary with an item's importance), and item cost data (handy when computing carrying costs, average inventory cost, and economic order quantities). The second table is #receipts, used to measure lead time: a purchase order has an order date followed by one or more receipt dates, which are useful in estimating lead time. The final table is #salesorders, which is self-explanatory. The one observation I would make is that the sales order date should be the date the customer requested, not the actual ship date.
create table #item (item varchar(3), daysinstock int, servicetarget float, stdcost float, carrycost float, ordercost float)
insert into #item select 'abc', 120, .95, 1.5, .12, 15
create table #receipts (purchaseorder int, item varchar(3), orderdate int, receiptdate int)
insert into #receipts select 321,'abc', 20130109, 20130226
insert into #receipts select 432,'abc', 20130325, 20130413
insert into #receipts select 543,'abc', 20131007, 20131130
create table #salesorders (salesorder int, salesreqdate int, item varchar(3), quantity int)
insert into #salesorders select 123, 20130208, 'abc', 1
insert into #salesorders select 234, 20131014, 'abc', 10
insert into #salesorders select 345, 20130413, 'abc', 35
insert into #salesorders select 456, 20130409, 'abc', 10
insert into #salesorders select 567, 20131105, 'abc', 5
insert into #salesorders select 678, 20130515, 'abc', 5
Using the built-in SQL functions for average, variance, and count, summarize the data with sub-queries over the receipts and sales order data, and place the results in a summary table with placeholders for lead time quantity (ltq), lead time variance (ltqv), order point (op), and economic order quantity (eoq). While all of these could be calculated in the first pass, I broke the work into updates to make it easier to follow.
select i.item, i.daysinstock, servicetarget, stdcost, carrycost, ordercost, altd, ltdv, ofreq/daysinstock opd, aoq, oqv,
cast(0 as float) ltq,
cast(0 as float) ltqv,
cast(0 as int) op,
cast(0 as int) eoq
into #summary from #item i
join
(select item, cast(count(1) as float) rfreq,
    avg(cast(datediff(d, cast(cast(orderdate as char(8)) as date), cast(cast(receiptdate as char(8)) as date)) as float)) altd,
    var(cast(datediff(d, cast(cast(orderdate as char(8)) as date), cast(cast(receiptdate as char(8)) as date)) as float)) ltdv
from #receipts group by item) lt
on i.item = lt.item
join
(select item, cast(count(1) as float) ofreq, avg(cast(quantity as float)) aoq, var(cast(quantity as float)) oqv from #salesorders group by item) so
on i.item = so.item
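As a sanity check, the same summary statistics can be reproduced outside SQL. This Python sketch recomputes altd, ltdv, opd, aoq, and oqv from the sample rows inserted above (Python's statistics.variance, like SQL's VAR, is the n-1 sample variance):

```python
from datetime import date
from statistics import mean, variance  # variance() uses n-1, matching SQL VAR()

def to_date(yyyymmdd):
    """Convert the yyyymmdd integers used in the temp tables to dates."""
    s = str(yyyymmdd)
    return date(int(s[:4]), int(s[4:6]), int(s[6:]))

# (orderdate, receiptdate) pairs from #receipts
receipts = [(20130109, 20130226), (20130325, 20130413), (20131007, 20131130)]
lead_times = [(to_date(r) - to_date(o)).days for o, r in receipts]
altd, ltdv = mean(lead_times), variance(lead_times)

# quantities from #salesorders, daysinstock from #item
quantities = [1, 10, 35, 10, 5, 5]
daysinstock = 120
opd = len(quantities) / daysinstock       # orders per day
aoq, oqv = mean(quantities), variance(quantities)
```

Running this against the sample data gives an average lead time of about 40 days and an average order quantity of 11, which is a useful cross-check on the SQL output.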
Use update statements to add the lead time quantity and lead time variance:
update #summary
set ltq = opd*altd*aoq,
ltqv = opd*altd*(oqv + power(aoq,2)) + power(opd,2)*power(aoq,2)*ltdv
The final step uses an update statement to create the order point and the economic order quantity (eoq). See http://en.wikipedia.org/wiki/Economic_order_quantity for a very thorough definition. Also note the use of dbo.fx_normsinv, a function that converts a targeted service level (95%) to a z-score; the code is included in the appendix.
update #summary
set op = cast(ltq+ power(ltqv, 0.5)*dbo.fx_normsinv(servicetarget) as int),
eoq = cast(power((2*opd*365*aoq*ordercost)/(carrycost*stdcost),.5) as int)
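The economic order quantity here is the classic Wilson formula, sqrt(2*D*S/H), where D is annual demand in units, S is the cost per order, and H is the annual carrying cost per unit. A Python sketch using the sample item's parameters (opd and aoq come from the sample data; the cost figures come from the #item row):

```python
import math

def eoq(opd, aoq, ordercost, carrycost, stdcost):
    """Wilson EOQ: sqrt(2*D*S/H)."""
    annual_demand = opd * 365 * aoq        # D: units per year
    holding_cost = carrycost * stdcost     # H: $ per unit per year
    return math.sqrt(2 * annual_demand * ordercost / holding_cost)

# Sample item: 0.05 orders/day, average quantity 11, $15 per order,
# 12% carrying cost on a $1.50 standard cost.
q = eoq(opd=0.05, aoq=11, ordercost=15, carrycost=0.12, stdcost=1.5)
```

The design intuition: order cost pushes toward fewer, larger orders, while holding cost pushes toward smaller, more frequent ones; EOQ is the quantity that minimizes their sum.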
In addition to planning, the data can also provide insight into performance and KPIs. For example, average inventory can be approximated as one half the order quantity plus safety stock, annual cost of goods sold (cogs) as annual unit demand times unit cost, and projected turns as cogs/avginv:
select
sum((eoq/2+ (op-ltq))*stdcost) avginv,
sum(opd*365*aoq*stdcost) cogs,
sum(opd*365*aoq*stdcost)/sum((eoq/2+ (op-ltq))*stdcost) turns
from #summary
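The same KPI arithmetic, sketched in Python for a single item. The input figures below are hypothetical, chosen only to show the mechanics (op - ltq is the safety stock):

```python
def kpis(eoq, op, ltq, opd, aoq, stdcost):
    """Return (average inventory $, annual cogs $, projected turns)."""
    avg_inv = (eoq / 2 + (op - ltq)) * stdcost   # cycle stock + safety stock, at cost
    cogs = opd * 365 * aoq * stdcost             # annual cost of goods sold
    return avg_inv, cogs, cogs / avg_inv         # turns = cogs / average inventory

# Hypothetical single-item figures for illustration only.
avg_inv, cogs, turns = kpis(eoq=183, op=163, ltq=100, opd=2, aoq=5, stdcost=1.5)
```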
That's it. I employed a slightly extended version of this to run inventory planning on nearly 200,000 SKUs with an inventory of over $100 million, achieving a 95+% service level.
Appendix:
Inverse of normal distribution per Abramowitz and Stegun formula 26.2.23
create function fx_normsinv(@p float)
returns float
as
begin
-- Abramowitz & Stegun 26.2.23 rational approximation (absolute error < 4.5e-4)
-- valid for 0 < @p < 1; log() in T-SQL is the natural log
declare @c0 float, @c1 float, @c2 float,
        @d1 float, @d2 float, @d3 float,
        @t float, @q float, @sign float
-- work with the upper-tail probability and remember the sign,
-- so p < 0.5 correctly returns a negative z
if @p >= 0.5
    select @q = 1.0 - @p, @sign = 1.0
else
    select @q = @p, @sign = -1.0
select @t = power(-2.0*log(@q), .5),
       @c0 = 2.515517, @c1 = 0.802853, @c2 = 0.010328,
       @d1 = 1.432788, @d2 = 0.189269, @d3 = 0.001308
return @sign*(@t - (@c0 + @c1*@t + @c2*@t*@t)/(1.0 + @d1*@t + @d2*@t*@t + @d3*@t*@t*@t))
end
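For reference, here is the same Abramowitz & Stegun 26.2.23 approximation in Python, which makes it easy to verify the stated accuracy (absolute error below 4.5e-4) against known z-scores such as z(0.95) ~ 1.6449:

```python
import math

def normsinv(p):
    """Inverse standard normal CDF, A&S formula 26.2.23 (0 < p < 1)."""
    q = 1.0 - p if p >= 0.5 else p          # upper-tail probability
    t = math.sqrt(-2.0 * math.log(q))
    num = 2.515517 + 0.802853*t + 0.010328*t*t
    den = 1.0 + 1.432788*t + 0.189269*t*t + 0.001308*t*t*t
    z = t - num/den
    return z if p >= 0.5 else -z            # restore the sign for p < 0.5

z95 = normsinv(0.95)   # ~1.6449 for a 95% service level
```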