
Big Data and Its Implementation in Supply Chain Management: A Case Study of UPS

July 15, 2018  •  Research Paper  •  1,508 Words (7 Pages)




Recently, people have witnessed the latest wave of technological progress, Industry 4.0, in which the virtual and physical worlds are connected. From an IT perspective, it involves a new level of networking, device interaction, and data integration in production. Information technology has therefore become necessary for companies to connect the whole value chain. As part of this fourth industrial revolution, Big Data has gained notable attention because its greatest influence on supply chains is driving the transformation from reactive to proactive operations. This paper examines Big Data and its contribution to a real business, using the case of UPS, a parcel delivery service.


The use of technology in organizations and supply chains has become a determining factor in enterprises' competitive advantage, and it is the main reason traditional supply chains evolved into e-supply chains. Because e-SCM was created using electronic linkages, it allowed low switching costs, which helped supply chain designs adapt to changing trends and competitive pressures (Williams et al., 2002). As systems and devices become interconnected, enormous amounts of data are produced by barcodes, RFID, the software systems controlling operations, and positioning devices. Big Data is the innovation that enables this untapped mass of data to be collected. Five characteristics are associated with the technology: volume, velocity, variety, veracity, and value; together they allow decisions to be made on the basis of efficiency benefits (Kim et al., 2016).


Today, e-commerce sites can carry out customized marketing based on clients' purchase and browsing records, forecasting buyers' consumption habits with Big Data (Li et al., 2015). For example, when buyers shop online on eBay or similar websites and search for a mobile phone, the site will automatically suggest battery packs or phone covers. Electronic supply chain management is clearly accountable for optimizing business processes and value in every corner of the extended operation, from the supplier's supplier to the customer's customer (Dhar & Mazumder, 2014). Big Data has modified the purely logistical function of reporting sales: the focus on ensuring supply from upstream to downstream has now become an independent function in supply chain management, supplemented with news, weather, and event data to generate insight and build scenarios before incidents occur (Haghighat, 2008).

Take the example of UPS, a multinational package delivery company that handles about 15 million packages per day and needs dynamic route optimization with regard to distance, fuel, and time. From a manager's view, if too many vehicles and resources are assigned to one distribution route, more money is spent than necessary, and those assets could be better utilized elsewhere (Ittmann, 2015). But if managers underestimate the number of vehicles a particular delivery route requires, they risk dispatching late shipments. To weigh these factors alongside promised delivery times and traffic conditions, UPS applies a system called On-Road Integrated Optimization and Navigation, or ORION. By analyzing the collected data, UPS found that trucks wasted fuel and incurred delays because they had to wait for gaps in traffic.
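The "customers who searched for a phone are shown battery packs and covers" behaviour described earlier can be sketched as a simple co-occurrence recommender. This is only a minimal illustration of the general technique, not how eBay or any real site implements it; the item names and order data below are invented for the example.

```python
from collections import defaultdict
from itertools import permutations

def build_cooccurrence(orders):
    """Count how often each pair of items appears in the same order."""
    counts = defaultdict(lambda: defaultdict(int))
    for order in orders:
        for a, b in permutations(set(order), 2):
            counts[a][b] += 1
    return counts

def recommend(counts, item, k=2):
    """Suggest the k items most often bought together with `item`
    (ties broken alphabetically so the result is deterministic)."""
    related = counts.get(item, {})
    ranked = sorted(related.items(), key=lambda kv: (-kv[1], kv[0]))
    return [name for name, _ in ranked[:k]]

# Hypothetical order history.
orders = [
    ["mobile phone", "phone cover", "battery pack"],
    ["mobile phone", "phone cover"],
    ["mobile phone", "battery pack"],
    ["headphones", "battery pack"],
]
print(recommend(build_cooccurrence(orders), "mobile phone"))
# → ['battery pack', 'phone cover']
```

Production recommenders use far richer signals (browsing history, collaborative filtering), but the core idea of mining co-purchase patterns is the same.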
ORION also combines clients' shipping requirements with customized map data, giving drivers accurate routing instructions that reduce the miles trucks travel and their CO2 emissions. For logistics, optimization is the most interesting feature because it saves money and avoids late shipments (Pratt 2015). Customers, meanwhile, want to track where their orders are while expecting prompt delivery; they do not hesitate to post complaints or comments on social media, and those data are altering a traditionally fragmented industry (Hey and Trefethen, 2003). ORION also benefits customers through personalized services that let them see upcoming home deliveries or actively reroute shipments and delivery dates as needed. Having said that, there are problems: drivers who follow ORION in practice report undue stress and longer working hours on the road, because the system creates more stops per mile, does not factor in stop lights, and fails to account for businesses that close early. With little social life left outside work, and sometimes little sleep, many grievances are filed by UPS workers every day, and a mass shooting occurred in which four drivers died (Budman et al., 2017). The overall challenge UPS faced in implementing this project was creating the right algorithms, making them work in practice, and then deploying the system to 55,000 drivers and staff at UPS facilities.
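ORION's actual algorithms are proprietary, but the underlying problem of sequencing stops to cut total miles can be illustrated with a classic nearest-neighbor heuristic: always drive to the closest unvisited stop. This is a deliberately crude sketch of the optimization idea, not UPS's method; the depot and stop coordinates are invented.

```python
import math

def nearest_neighbor_route(depot, stops):
    """Greedy heuristic: repeatedly drive to the closest unvisited stop,
    then return to the depot. A toy stand-in for real route optimization."""
    route, current = [depot], depot
    remaining = list(stops)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)  # close the loop back at the depot
    return route

def route_length(route):
    """Total straight-line distance of the route."""
    return sum(math.dist(a, b) for a, b in zip(route, route[1:]))

# Hypothetical depot and delivery stops on a flat grid (km).
depot = (0.0, 0.0)
stops = [(2.0, 3.0), (5.0, 1.0), (1.0, 1.0), (6.0, 4.0)]
route = nearest_neighbor_route(depot, stops)
print(route, round(route_length(route), 2))
```

Real systems must also handle time windows, traffic, turn restrictions, and tens of thousands of vehicles, which is why UPS's deployment effort was so large; exact solutions to this problem (the traveling salesman problem) are computationally intractable at scale, so heuristics like this one trade optimality for speed.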

Only about 15% of Big Data projects are in production, so there is no doubt Big Data has a long way to go before it reaches the plateau of productivity (Kitchin 2014). Opportunities come with challenges. Data storage is not always easily accomplished: when the volume of data keeps growing rapidly, storage is the first and foremost issue (Wang et al. 2016). Grid computing offers high storage capacity and processing power that translate into data and computational grids, and for the scalability challenge, keeping data in the cloud provides nearly unlimited space that can be secured.

Additionally, data itself is not valuable; people need to make sure data meet requirements to be worthwhile. Many organizations concentrate on collecting and storing data without the capability to take advantage of it, and that matters because too much data produces information overload. By binning data together, users can visualize it more effectively and see where data quality is a concern and where outliers and unneeded data sit. Innovative enterprises, whether big technology companies or startups, may need to invest millions of dollars in developing programs and algorithms to clean, control, and maintain data (Bernasek 2015). For the variety issue, it is recommended to process data through open-source software such as Apache Hadoop, since it divides data across multiple systems in the infrastructure.

More importantly, people are concerned about privacy: the collection of Big Data may sweep in information without users' permission, which would violate the law in some countries. From a security perspective, the main solution is the adequate use of encryption; it is vital to create policies that permit only authorized users to access data, together with real-time security monitoring.
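The binning-and-outlier idea described above can be sketched with the Python standard library alone. This is a minimal illustration of inspecting data quality before analysis; the field (parcel weights) and the values are hypothetical.

```python
import statistics
from collections import Counter

def bin_counts(values, width):
    """Group values into fixed-width bins so their distribution
    is easy to eyeball and quality problems stand out."""
    return Counter((v // width) * width for v in values)

def outliers(values, z=3.0):
    """Flag values more than z standard deviations from the mean."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > z * sd]

# Hypothetical parcel weights (kg) with one bad reading from a faulty scale.
weights = [1.2, 0.8, 2.5, 1.9, 3.1, 2.2, 1.4, 250.0]
print(bin_counts(weights, 1))
print(outliers(weights, z=2))
# → [250.0]
```

At real Big Data scale this binning would run distributed (e.g. as an aggregation in Hadoop or Spark) rather than in a single process, but the logic of summarizing first and flagging suspect values is the same.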
Threat intelligence is essential for attack detection because it analyzes both internal and external threats to organizations in a systematic way. In the next step of the evolution, people will enter the zettabyte era, and Big Data, together with the rise of the Internet of Things, will be critical for businesses to comprehend opportunities and dangers in real time, because change is not just constant but exponential.
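The real-time security monitoring mentioned above can be sketched as a sliding-window rate check: flag any account whose request rate suddenly spikes, a common first signal of credential abuse or data exfiltration. This toy class is an invented illustration of the pattern, not a description of any real monitoring product; the thresholds and user names are assumptions.

```python
from collections import deque

class AccessMonitor:
    """Flag users whose request count in a sliding time window
    exceeds a threshold. A minimal sketch of real-time monitoring."""

    def __init__(self, window_seconds=60, max_requests=100):
        self.window = window_seconds
        self.max_requests = max_requests
        self.events = {}  # user -> deque of recent request timestamps

    def record(self, user, timestamp):
        """Log one request; return True if this user's recent rate is suspicious."""
        q = self.events.setdefault(user, deque())
        q.append(timestamp)
        # Drop timestamps that have fallen out of the window.
        while q and timestamp - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_requests

# Hypothetical stream: four requests from one user within 30 seconds,
# against a deliberately low threshold of 3 per minute.
monitor = AccessMonitor(window_seconds=60, max_requests=3)
alerts = [monitor.record("analyst", t) for t in (0, 10, 20, 30)]
print(alerts)
# → [False, False, False, True]
```

Production systems would combine many such signals with external threat-intelligence feeds and machine-learned baselines, but the window-and-threshold structure is the basic building block.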

