June 23-27 International Material Handling Research Colloquium – Cincinnati, OH

Intelligrated HQ

The purpose of the International Material Handling Research Colloquium (IMHRC) is to share world-class research accomplishments, projects, and trends in the fields of material handling, facility logistics, and intralogistics.

It aims to facilitate dialog and collaborative research by teams of university researchers on leading-edge topics of interest to end users as well as technology and solutions providers. The Colloquium operates on an immersion philosophy: all participants take part in all Colloquium events. The program includes a mix of invited presentations, facilitated discussions, poster sessions, facility tours, and social events.

The first Research Colloquium took place in June 1990 on the corporate campus of Litton Industrial Automated Systems in Hebron, Kentucky, USA. Its success prompted a series of subsequent Colloquia: in 1992 in Milwaukee, Wisconsin, USA at the corporate headquarters of Rockwell Automation/Allen-Bradley; in 1994 in Grand Rapids, Michigan, USA at the corporate headquarters of Rapistan Demag Corporation; in 1996 in ‘s-Hertogenbosch, the Netherlands, at the corporate headquarters of Vanderlande Industries; in 1998 in Chandler, Arizona, USA at the headquarters of Motorola Corporation; in 2000 in York, Pennsylvania, USA at the headquarters of St. Onge Company; in 2002 in Portland, Maine, USA at the headquarters of Southworth International; in 2004 in Graz, Austria on the campus of the Technical University of Graz, with additional financial support provided by Knapp, Salomon, SSI Schaefer Peem, and TGW; in 2006 in Salt Lake City, Utah, USA at the headquarters of Daifuku America; in 2008 in Dortmund, Germany, hosted by the Fraunhofer Institute for Material Flow and Logistics (IML) at the University of Dortmund, with support provided by Beumer and Savoye; in 2010 in Milwaukee, USA, financially supported by RedPrairie and HK Systems, with the Center for Supply Chain Management at Marquette University serving as academic host; and in 2012 in Gardanne, France, hosted and financially supported by École Nationale Supérieure des Mines de Saint-Étienne (EMSE), which served as academic host. The 2014 industrial host is Intelligrated.

Andrew Johnson’s research lab presented “Order batching with time constraints in a parallel-aisle warehouse: a multiple-policy approach”. This paper investigates the potential gains, in terms of reduced blocking and delays in order picking systems, when multiple routing and order batching strategies are considered. The paper will appear in the conference proceedings.

http://www.mhi.org/cicmhe/colloquium

June 17: University of Science and Technology of China – Hefei


Stochastic Nonparametric Approach to Efficiency Analysis: A Unified Framework

Efficiency analysis is an essential and extensive research area that provides answers to such important questions as: Who are the best performing firms and can we learn something from their behavior? What are the sources of efficiency differences across firms? Can efficiency be improved by government policy or better managerial practices? Are there benefits to increasing the scale of operations? These are examples of important questions we hope to resolve with efficiency analyses.

Efficiency analysis is an interdisciplinary field that spans such disciplines as economics, econometrics, operations research and management science, and engineering, among others. The methods of efficiency analysis are utilized in several fields of application including agriculture, banking, education, environment, health care, energy, manufacturing, transportation, and utilities, among many others. Efficiency analysis is performed at various different scales. Micro level applications range from individual persons, teams, production plants and facilities to company level and industry level efficiency assessments. Macro level applications range from comparative efficiency assessments of production systems or industries across countries to efficiency assessment of national economies. Indeed, efficiency improvement is one of the key components of productivity growth (e.g., Färe et al., 1994), which in turn is the primary driver of economic welfare.

Unfortunately, there is currently no commonly accepted methodology of efficiency analysis; the field is divided between two competing approaches: Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA). Bridging the gap between axiomatic DEA and stochastic SFA was long one of the most vexing problems in the field. Recent work on convex nonparametric least squares (CNLS) by Kuosmanen (2008), Kuosmanen and Johnson (2010), and Kuosmanen and Kortelainen (2012) has led to the full integration of DEA and SFA into a unified framework of productivity analysis, which we refer to as stochastic nonparametric envelopment of data (StoNED).
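To give a concrete, deliberately oversimplified flavor of the DEA side of this comparison: under constant returns to scale with a single input and a single output, the DEA efficiency score reduces to comparing each firm’s output/input ratio against the best observed ratio. The sketch below is an illustrative toy with hypothetical data, not the general multi-input linear-programming formulation used in practice.

```python
def ccr_efficiency(x, y):
    """Single-input, single-output CRS (CCR) DEA efficiency scores.

    With one input and one output under constant returns to scale,
    the DEA frontier is the ray through the firm with the best
    output/input ratio, so each firm's score is its own ratio
    divided by the best ratio observed in the sample.
    """
    best = max(yj / xj for xj, yj in zip(x, y))
    return [(yj / xj) / best for xj, yj in zip(x, y)]

# Hypothetical data: three firms, (input, output)
inputs = [2.0, 4.0, 6.0]
outputs = [2.0, 4.0, 3.0]
scores = ccr_efficiency(inputs, outputs)  # third firm is half as productive
```

Note that a score of 1.0 marks a frontier firm; anything below 1.0 quantifies the proportional input reduction needed to reach the frontier. The axiomatic character of DEA lies in the frontier being built only from the observed data plus production axioms, with no noise term.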

We see the development of StoNED as a paradigm shift for efficiency analysis. It is no longer necessary to decide whether modeling noise is more important than imposing the axioms of production theory: StoNED allows both. The unified framework offers deeper insights into the foundations of DEA and SFA, and it also provides a more general and flexible platform for efficiency analysis and related themes such as frontier estimation and production analysis. Further, a number of extensions to the original DEA and SFA methods have been developed over the past decades. The unified StoNED framework allows the existing tools of efficiency analysis to be combined in novel ways across the DEA-SFA spectrum, opening new opportunities for methodological development.

The main objective of this presentation is to describe the CNLS and StoNED methods.

June 4th: North American Productivity Workshop


The North American Productivity Workshop (NAPW) is a biennial conference held in North America in even years; its sister conference, the European Workshop on Efficiency and Productivity Analysis (EWEPA), is held in odd years. It brings together researchers in economics, operations research, management science, engineering, and a wide variety of application areas to discuss the latest innovations in efficiency and productivity research.

The 9th NAPW was held in Ottawa, Canada, from June 4th to 7th. Keynote speakers included Ariel Pakes, Thomas Professor of Economics (Harvard University); Erwin Diewert, Professor in the Vancouver School of Economics (University of British Columbia); William Greene, Robert Stansky Professor of Economics and Toyota Motor Corp. Professor of Economics (New York University); John C. Haltiwanger, Dudley and Louisa Dillard Professor of Economics and Distinguished University Professor (University of Maryland); Dale Jorgenson, Samuel W. Morris University Professor of Economics (Harvard University); and Robin C. Sickles, Reginald Henry Hargrove Professor of Economics and Professor of Statistics (Rice University; Visiting Professor of Production Econometrics, Loughborough University).

Andrew Johnson’s research team made two presentations (listed below). Both will be chapters in José Luis Preciado Arreola’s Ph.D. dissertation; the first has been accepted for publication in the American Journal of Agricultural Economics.

“A Birth-Death Markov Chain Monte Carlo method to estimate the number of states in a state-contingent production frontier model”, José Luis Preciado Arreola (Texas A&M University), Andrew (Andy) Johnson (Texas A&M University).

“A Semi-parametric Bayesian Concave Regression Method to Estimate Production Frontiers”, José Luis Preciado Arreola (Texas A&M University), Andrew (Andy) Johnson (Texas A&M University).

http://www.napw2014.com

May 15: Harvard Business School


10 Years of the World Management Survey: Lessons and Next Steps
How Much Does Management Affect Productive Performance? New Insights from a Semi-Nonparametric Analysis of the World Management Survey

Abstract
It remains a challenge to demonstrate the effects of management on performance beyond simply relying on case studies and anecdotal evidence. The World Management Survey presents a unique opportunity to look more closely at the relationship between management and performance. This paper critiques prior research and offers alternative semi-nonparametric estimation techniques. Findings reveal that the effect of management varies significantly across countries, that some management practices are more important than others, and that management has a significant effect on output, even in a cross-sectional analysis.

http://www.hbs.edu/faculty/conferences/2014-world-management-survey/Pages/default.aspx

March 7: Aalto University


Department of Service and Information Economy – Helsinki, Finland
Analysis and Control of Batch Order Picking Processes Considering Picker Blocking

Abstract:
Order picking operations play a critical role in the order fulfillment process of distribution centers (DCs). Picking a batch of orders is often favored when customers’ demands create a large number of small orders, since the traditional single order picking process results in low utilization of order pickers and significant operational costs. Specifically, batch picking improves order picking performance by consolidating multiple orders in a “batch” to reduce the number of trips and total travel distance required to retrieve the items. As more pickers are added to meet increased demand, order picking performance is likely to decline due to significant picker blocking. However, in batch picking, the process of assigning orders to particular batches allows additional flexibility to reduce picker blocking.
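As a toy illustration of the travel-distance argument (the routing model and data here are my own simplifying assumptions, not the paper’s): suppose each pick trip down an aisle costs twice the distance to the farthest pick location, a simple “return” routing policy. Consolidating several small orders into one batch then cuts the number of trips, and total travel falls accordingly.

```python
def trip_distance(picks):
    # Hypothetical single-aisle "return" routing: walk to the farthest
    # pick location and back, so one trip costs 2 * max location.
    return 2 * max(picks)

def total_travel(trips):
    # Sum the routing cost over all trips made.
    return sum(trip_distance(t) for t in trips)

# Three small orders, each a set of pick locations along the aisle
orders = [{3, 7}, {2, 8}, {5, 6}]

one_trip_per_order = total_travel(orders)                # 2*7 + 2*8 + 2*6 = 42
one_batched_trip = total_travel([set().union(*orders)])  # single trip: 2*8 = 16
```

The flip side, which this toy ignores, is that a single batched trip puts every picker on the same route at the same time, which is exactly where picker blocking arises; the research summarized above addresses that trade-off.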

This research aims to identify, analyze, and control, or mitigate, picker blocking while batch picking in picker-to-part systems. We first develop a large-scale proximity-batching procedure that can enhance the solution quality of traditional batching models to near-optimality as measured by travel distance. Through simulation studies, picker blocking is quantified. The results illustrate: a) a complex relationship between picker blocking and batch formation; and b) a significant productivity loss due to picker blocking.

Based on our analysis, we develop additional analytical and simulation models to investigate the effects of picker blocking in batch picking and to identify the picking, batching, and sorting strategies that reduce congestion. A new batching model, the indexed order batching model (IBM), is proposed to consider both order proximity and picker blocking in optimizing the total order picking time. We also apply the proposed approach to bucket brigade picking systems, where hand-off delays as well as picker blocking must be considered.

The research offers new insights about picker blocking in batch picking operations, develops batch picking models, and provides complete control procedures for large-scale or dynamic batch picking situations. The twin goals of added flexibility and reduced costs are highlighted throughout the analysis. This is collaborative work with Soondo Hong, Assistant Professor at Pusan National University, and Brett Peters, Dean of the College of Engineering & Applied Science at the University of Wisconsin-Milwaukee.

March 5: University of Leuven

Group for the Advancement of Revealed Preferences – Kortrijk, Belgium

Orthogonality conditions for identification of joint production technologies
Axiomatic nonparametric approach to the estimation of stochastic distance functions

Abstract:
The classic econometric approach treats productivity as a residual term of the standard microeconomic production model. Critics of this approach argue that productivity shocks correlate with the input factors that are used as explanatory variables of the regression model, which causes an endogeneity problem. This paper sheds new light on this issue from the perspective of production theory. We first examine the standard cost minimization problem to demonstrate that even if the observed inputs and outputs are endogenous, consistent estimation of the input distance function is possible under certain conditions. This result reveals that the orthogonality conditions required for econometric identification critically depend on the specification of the distance metric, which suggests the directional distance function as one possible solution to the endogeneity problem. We then introduce a stochastic data generating process of joint production where all inputs and outputs correlate with inefficiency and noise. We show that an appropriately specified direction vector can provide the orthogonality conditions required for identification of the directional distance functions. A consistent nonparametric estimator of the directional distance function is developed, which satisfies the essential axioms of production theory. Specification of the direction vector is examined through an application to electricity distribution firms.
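For readers unfamiliar with the central object in this abstract, the directional distance function is conventionally defined as follows (standard textbook definition; the notation here is mine, not the paper’s):

```latex
% Directional distance function for technology set T and
% direction vector g = (g_x, g_y):
\vec{D}(x, y;\, g_x, g_y) \;=\;
  \sup \bigl\{ \beta \in \mathbb{R} :
    (x - \beta g_x,\; y + \beta g_y) \in T \bigr\}
% beta measures how far the point (x, y) can be projected toward
% the frontier along direction g: inputs contracted by beta*g_x
% while outputs are expanded by beta*g_y simultaneously.
```

The abstract’s point is that the choice of the direction vector g is not innocuous: it determines which orthogonality conditions hold, and hence whether the distance function can be identified when inputs and outputs are endogenous.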

February 19: Queen’s University


Finance Research Group Seminar – Belfast, Northern Ireland
Benchmarking Managerial Performance: A Stochastic Semi-Nonparametric Envelopment of Data Approach

The Finnish electricity market has a competitive energy generation market and a monopolistic transmission system. To regulate the local monopoly power of network operators, the government regulator uses frontier estimation methods (e.g., Stochastic Frontier Analysis (SFA) and nonparametric Data Envelopment Analysis (DEA)) to identify excessive transmission costs, taking into account outputs and the operating environment. We describe the new regulatory system developed for the Finnish regulator, which is based on the Stochastic Non-smooth Envelopment of Data (StoNED) method and utilizes panel data to distinguish excessive costs from random noise.

The literature on productive efficiency analysis is divided into two main branches: parametric SFA and nonparametric DEA. StoNED is a new frontier estimation framework that combines the virtues of both DEA and SFA in a unified approach to frontier analysis. StoNED follows the SFA approach by including a stochastic component. In contrast to SFA, however, the proposed method does not make any prior assumptions about the functional form of the production function. In that respect, StoNED is similar to DEA, imposing only free disposability, convexity, and a returns-to-scale specification.

The main advantage of the StoNED approach over parametric SFA is its independence from ad hoc parametric assumptions about the functional form of the production function (or cost/distance functions). In contrast to flexible functional forms, one can impose monotonicity, concavity, and homogeneity constraints without sacrificing the flexibility of the regression function. The main advantage of StoNED over nonparametric DEA is robustness to outliers, data errors, and other stochastic noise in the data. In DEA the frontier is spanned by a relatively small number of efficient firms, whereas in our method all observations influence the shape of the frontier. Moreover, many standard tools from parametric regression, such as goodness-of-fit statistics and statistical tests, are directly applicable in our approach. This is collaborative work with Timo Kuosmanen of the Business School at Aalto University.

November 21: University of Southern Denmark


Department of Business and Economics
Regulating Local Monopolies in Electricity Transmission: A Real-world Application of the StoNED Method

Abstract:
The Finnish electricity market has a competitive energy generation market and a monopolistic transmission system. To regulate the local monopoly power of network operators, the government regulator uses frontier estimation methods (e.g., Stochastic Frontier Analysis (SFA) and nonparametric Data Envelopment Analysis (DEA)) to identify excessive transmission costs, taking into account outputs and the operating environment. We describe the new regulatory system developed for the Finnish regulator, which is based on the Stochastic Non-smooth Envelopment of Data (StoNED) method and utilizes panel data to distinguish excessive costs from random noise.

The literature on productive efficiency analysis is divided into two main branches: parametric SFA and nonparametric DEA. StoNED is a new frontier estimation framework that combines the virtues of both DEA and SFA in a unified approach to frontier analysis. StoNED follows the SFA approach by including a stochastic component. In contrast to SFA, however, the proposed method does not make any prior assumptions about the functional form of the production function. In that respect, StoNED is similar to DEA, imposing only free disposability, convexity, and a returns-to-scale specification.

The main advantage of the StoNED approach over parametric SFA is its independence from ad hoc parametric assumptions about the functional form of the production function (or cost/distance functions). In contrast to flexible functional forms, one can impose monotonicity, concavity, and homogeneity constraints without sacrificing the flexibility of the regression function. The main advantage of StoNED over nonparametric DEA is robustness to outliers, data errors, and other stochastic noise in the data. In DEA the frontier is spanned by a relatively small number of efficient firms, whereas in our method all observations influence the shape of the frontier. Moreover, many standard tools from parametric regression, such as goodness-of-fit statistics and statistical tests, are directly applicable in our approach. This is collaborative work with Timo Kuosmanen of the Business School at Aalto University.