TECHNOLOGY IN LOGISTICS
Please enquire with us about the uses of the technologies below within logistics.
Application Program Interface – A programmer will code interactions with the database (sometimes referred to as a data source) via an application program interface (API) or via a database language. The particular API or language chosen will need to be supported by the DBMS, possibly indirectly via a pre-processor or a bridging API. Some APIs aim to be database independent.
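As an illustrative sketch of a database-independent API, Python's built-in DB-API (the connect/cursor/execute pattern, here backed by the bundled sqlite3 driver) lets the same code work against many databases. The table and values are hypothetical examples, not part of any particular product.

```python
import sqlite3

# The DB-API's connect/cursor/execute pattern is shared by many drivers
# (sqlite3, PostgreSQL drivers, etc.) - one example of a database-independent API.
conn = sqlite3.connect(":memory:")  # a throwaway in-memory database
cur = conn.cursor()
cur.execute("CREATE TABLE shipments (id INTEGER PRIMARY KEY, destination TEXT)")
cur.execute("INSERT INTO shipments (destination) VALUES (?)", ("Rotterdam",))
conn.commit()
cur.execute("SELECT destination FROM shipments WHERE id = 1")
row = cur.fetchone()
print(row[0])  # → Rotterdam
```

Because the API is standardized, switching the underlying DBMS largely means changing the `connect` call rather than the query code.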
Artificial Intelligence – is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans. AI textbooks define the field as the study of “intelligent agents”: any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals. Colloquially, the term “artificial intelligence” is often used to describe machines (or computers) that mimic “cognitive” functions that humans associate with the human mind, such as “learning” and “problem solving”. As machines become increasingly capable, tasks considered to require “intelligence” are often removed from the definition of AI, a phenomenon known as the AI effect. For instance, optical character recognition is frequently excluded from things considered to be AI, having become a routine technology. Modern machine capabilities generally classified as AI include successfully executing autonomous tasks and intelligent routing in content delivery networks. In the twenty-first century, AI techniques have experienced a resurgence following concurrent advances in computer power, large amounts of data, and theoretical understanding; and AI techniques have become an essential part of the technology industry, helping to solve many challenging problems in computer science, software engineering and operations research. AI often revolves around the use of algorithms. An algorithm is a set of unambiguous instructions that a mechanical computer can execute. A complex algorithm is often built on top of other, simpler, algorithms.
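The last point — a complex algorithm built on simpler ones — can be sketched with a small routing example: Dijkstra's shortest-path algorithm built on top of a simpler binary-heap priority queue. The depot network below is a hypothetical illustration.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm: a routing algorithm built on top of a
    simpler algorithm, the binary-heap priority queue (heapq)."""
    queue = [(0, start)]              # (travel cost so far, node)
    best = {start: 0}
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            return cost
        if cost > best.get(node, float("inf")):
            continue                  # stale queue entry, skip it
        for neighbour, weight in graph.get(node, []):
            new_cost = cost + weight
            if new_cost < best.get(neighbour, float("inf")):
                best[neighbour] = new_cost
                heapq.heappush(queue, (new_cost, neighbour))
    return None                       # goal unreachable

# Hypothetical depot network: travel times (hours) between hubs.
network = {
    "Depot": [("HubA", 4), ("HubB", 1)],
    "HubB": [("HubA", 2), ("Customer", 5)],
    "HubA": [("Customer", 1)],
}
print(shortest_route(network, "Depot", "Customer"))  # → 4
```

The unambiguous step-by-step instructions are what make this an algorithm; the "intelligence" of an AI routing system comes from layering learning and heuristics on top of such building blocks.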
Big Data – is a field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be dealt with by traditional data-processing application software. Data with many cases (rows) offer greater statistical power, while data with higher complexity (more attributes or columns) may lead to a higher false discovery rate. Big data challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating, information privacy and data source. Big data was originally associated with three key concepts: volume, variety, and velocity. When we handle big data, we may not sample but simply observe and track what happens. Therefore, big data often includes data with sizes that exceed the capacity of traditional software to process within an acceptable time and at an acceptable cost.
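A minimal sketch of the "observe everything, don't sample" idea: process an event stream one record at a time, keeping only running aggregates so memory use stays constant regardless of volume. The generator below stands in for an unbounded feed such as sensor readings; all names are illustrative.

```python
def running_stats(stream):
    """Process events one at a time (velocity) without holding the whole
    data set in memory (volume): track count, mean and max incrementally."""
    count = total = 0
    largest = float("-inf")
    for value in stream:
        count += 1
        total += value
        largest = max(largest, value)
    return {"count": count, "mean": total / count, "max": largest}

# A generator stands in for an unbounded event feed (e.g. telematics pings).
readings = (x % 7 for x in range(1_000_000))
stats = running_stats(readings)
print(stats["count"], stats["max"])  # → 1000000 6
```

Real big-data stacks distribute this same pattern across many machines, but the principle — streaming aggregation instead of loading everything — is the same.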
BlockChain – is a growing list of records, called blocks, that are linked using cryptography. Each block contains a cryptographic hash of the previous block, a timestamp, and transaction data (generally represented as a Merkle tree). By design, a BlockChain is resistant to modification of the data. It is “an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way”. For use as a distributed ledger, a BlockChain is typically managed by a peer-to-peer network collectively adhering to a protocol for inter-node communication and validating new blocks. Once recorded, the data in any given block cannot be altered retroactively without alteration of all subsequent blocks, which requires consensus of the network majority. Although BlockChain records are not strictly unalterable, the need for majority consensus to rewrite the chain is why decentralized consensus can be claimed with a BlockChain.
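The hash-linking described above can be sketched in a few lines: each block stores the hash of the previous block, so retroactively editing an earlier block breaks the chain. This toy omits timestamps, Merkle trees and consensus; the transaction strings are hypothetical.

```python
import hashlib
import json

def hash_of(block):
    """Hash a block's contents (excluding its stored hash field)."""
    payload = {k: block[k] for k in ("prev_hash", "transactions")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash, transactions):
    block = {"prev_hash": prev_hash, "transactions": transactions}
    block["hash"] = hash_of(block)
    return block

def chain_is_valid(chain):
    # Valid when every block's stored hash matches its contents and every
    # block points at the hash of the block before it.
    for i, block in enumerate(chain):
        if block["hash"] != hash_of(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block("0" * 64, ["genesis"])
chain = [genesis, make_block(genesis["hash"], ["A ships 3 pallets to B"])]
print(chain_is_valid(chain))                      # → True
chain[0]["transactions"].append("forged entry")   # retroactive edit
print(chain_is_valid(chain))                      # → False
```

In a real network, repairing the break would require re-mining every subsequent block faster than the honest majority — which is what makes retroactive alteration impractical.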
Databases – A database is an organized collection of data, generally stored and accessed electronically from a computer system. The database management system (DBMS) is the software that interacts with end users, applications, and the database itself to capture and analyze the data. The DBMS software additionally encompasses the core facilities provided to administer the database. The sum total of the database, the DBMS and the associated applications can be referred to as a “database system”. Often the term “database” is also used to loosely refer to any of the DBMS, the database system or an application associated with the database.
Data Hub or E-Hub Services – A data hub differs from a data warehouse in that it is generally unintegrated and often at different grains. It differs from an operational data store because a data hub does not need to be limited to operational data. A data hub differs from a data lake by homogenizing data and possibly serving data in multiple desired formats, rather than simply storing it in one place, and by adding other value to the data such as de-duplication, quality, security, and a standardized set of query services. A Data Lake tends to store data in one place for availability, and allow/require the consumer to process or add value to the data. Data Hubs are ideally the “go-to” place for data within an enterprise, so that many point-to-point connections between callers and data suppliers do not need to be made, and so that the Data Hub organization can negotiate deliverables and schedules with various data enclave teams, rather than being an organizational free-for-all as different teams try to get new services and features from many other teams.
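A toy sketch of the value-adding behaviour described above, under stated assumptions: a hub that ingests records from several suppliers, de-duplicates them on a key, and serves a standardized query result in more than one format. The `DataHub` class, key names and records are all hypothetical.

```python
import csv
import io
import json

class DataHub:
    """Toy 'go-to' hub: de-duplicates supplier records on a key and serves
    them in multiple desired formats via one standardized query service."""
    def __init__(self, key):
        self.key = key
        self.records = {}

    def ingest(self, records):
        for rec in records:
            # Later copies of the same key overwrite earlier duplicates.
            self.records[rec[self.key]] = rec

    def query(self, fmt="json"):
        rows = sorted(self.records.values(), key=lambda r: r[self.key])
        if fmt == "json":
            return json.dumps(rows)
        if fmt == "csv":
            out = io.StringIO()
            writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)
            return out.getvalue()
        raise ValueError(f"unsupported format: {fmt}")

hub = DataHub(key="order_id")
hub.ingest([{"order_id": 1, "status": "packed"}])
hub.ingest([{"order_id": 1, "status": "shipped"},   # duplicate of order 1
            {"order_id": 2, "status": "packed"}])
print(hub.query("json"))
```

Callers talk to the hub's one query interface instead of maintaining point-to-point connections to each supplier — the organizational benefit the paragraph describes.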
Digitization – is the process of converting information into a digital (i.e. computer-readable) format, in which the information is organized into bits. The result is the representation of a document by generating a series of numbers that describe a discrete set of points or samples. The result is called digital representation or, more specifically, a digital image, for the object, and digital form, for the signal. In modern practice, the digitized data is in the form of binary numbers, which facilitate computer processing and other operations, but, strictly speaking, digitizing simply means the conversion of analog source material into a numerical format; decimal or any other number system can be used instead. Digitization is of crucial importance to data processing, storage and transmission, because it “allows information of all kinds in all formats to be carried with the same efficiency and also intermingled”. Though analog data is typically more stable, digital data can more easily be shared and accessed and can, in theory, be propagated indefinitely, provided it is migrated to stable formats as needed. This is why it is a favored way of preserving information for many organizations.
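The sampling-and-quantization step can be shown in miniature: map analog samples (here floats in the range −1 to 1) onto a discrete set of integer levels, i.e. organize the information into bits. The function name and signal are illustrative.

```python
def digitize(samples, bits=8):
    """Quantize analog samples in [-1, 1] to 2**bits discrete levels,
    producing the series of numbers that is the digital representation."""
    levels = 2 ** bits - 1
    return [round((s + 1) / 2 * levels) for s in samples]

analog = [-1.0, -0.5, 0.0, 0.5, 1.0]   # a few points of an analog signal
digital = digitize(analog)
print(digital)  # → [0, 64, 128, 191, 255]
```

Each 8-bit value here is one sample; more bits per sample and more samples per second give a more faithful, but larger, digital representation.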
Machine Learning – Machine learning (ML) is the scientific study of algorithms and statistical models that computer systems use to perform a specific task without using explicit instructions, relying on patterns and inference instead. It is seen as a subset of artificial intelligence. Machine learning algorithms build a mathematical model based on sample data, known as “training data“, in order to make predictions or decisions without being explicitly programmed to perform the task. Machine learning algorithms are used in a wide variety of applications, such as email filtering, invoicing and computer vision, where it is difficult or infeasible to develop a conventional algorithm for effectively performing the task. Machine learning is closely related to computational statistics, which focuses on making predictions using computers. The study of mathematical optimization delivers methods, theory and application domains to the field of machine learning.
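A minimal sketch of building a mathematical model from training data: ordinary least squares fitted to a hypothetical set of route distances and delivery times, then used to predict an unseen case. The numbers and variable names are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b: the parameters are learned
    from sample data rather than being explicitly programmed."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical training data: route distance (km) vs delivery time (hours).
distances = [10, 20, 40, 80]
times = [1.0, 1.5, 2.5, 4.5]
a, b = fit_line(distances, times)
print(round(a * 60 + b, 2))  # predicted hours for an unseen 60 km route → 3.5
```

The same fit-then-predict loop, scaled up to many attributes and far richer models, is what the email filtering and computer vision applications above rely on.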
Data mining is a field of study within machine learning that focuses on exploratory data analysis through unsupervised learning. In its application across business problems, machine learning is also referred to as predictive analytics, and it is a very useful tool for creating business intelligence and dashboard reporting.
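Unsupervised learning can be sketched with a tiny one-dimensional k-means: the data carries no labels, yet the algorithm discovers the natural groups on its own. The parcel weights below are hypothetical.

```python
def kmeans_1d(values, k, iterations=20):
    """Minimal k-means on one attribute: unsupervised learning groups the
    data without any labelled training examples."""
    # Seed centres with values spread across the sorted data.
    centres = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iterations):
        clusters = [[] for _ in centres]
        for v in values:
            nearest = min(range(len(centres)), key=lambda i: abs(v - centres[i]))
            clusters[nearest].append(v)
        # Move each centre to the mean of its cluster.
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return sorted(centres)

# Hypothetical parcel weights (kg) falling into two natural groups.
weights = [1.1, 0.9, 1.0, 9.8, 10.2, 10.0]
centres = kmeans_1d(weights, k=2)
print(centres)  # two cluster centres, near 1.0 and 10.0
```

Exploratory clustering like this is often the first step before the clusters are interpreted (e.g. "small parcels" vs "pallet freight") and fed into dashboards.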
Power BI Dashboards – Power BI provides cloud-based BI services, known as “Power BI Services”, along with a desktop-based interface, called “Power BI Desktop”. It offers data warehouse capabilities including data preparation, data discovery and interactive dashboards. Microsoft released an additional service called Power BI Embedded on its Azure cloud platform. One main differentiator of the product is the ability to load custom visualizations.
Optical Character Recognition – is the electronic or mechanical conversion of images of typed, handwritten or printed text into machine-encoded text, whether from a scanned document, a photo of a document, a scene-photo (for example the text on signs and billboards in a landscape photo) or from subtitle text superimposed on an image (for example from a television broadcast). Widely used as a form of data entry from printed paper data records – whether delivery notes, proof of deliveries, invoices, mail, printouts of static data, or any suitable documentation – it is a common method of digitizing printed texts so that they can be electronically edited, searched, stored more compactly, displayed on-line, and used in machine processes such as cognitive computing, machine translation, (extracted) text-to-speech, key data entry and text mining.
Early versions needed to be trained with images of each character, and worked on one font at a time. Advanced systems capable of producing a high degree of recognition accuracy for most fonts are now common, and with support for a variety of digital image file format inputs. Some systems are capable of reproducing formatted output that closely approximates the original page including images, columns, and other non-textual components.
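The early, one-font-at-a-time approach can be sketched as template matching: the system is "trained" with one bitmap per character and recognises a scanned glyph by counting agreeing pixels. The 3x3 font and the noisy glyph below are invented for illustration; real systems use far larger images and statistical models.

```python
# Templates the system is "trained" with: one 3x3 bitmap per character,
# for a single hypothetical font, as early OCR systems required.
FONT = {
    "I": ("010",
          "010",
          "010"),
    "L": ("100",
          "100",
          "111"),
    "T": ("111",
          "010",
          "010"),
}

def recognise(glyph):
    """Match a scanned glyph against the trained templates, scoring each
    by the number of agreeing pixels, and return the best character."""
    def score(template):
        return sum(a == b
                   for row_t, row_g in zip(template, glyph)
                   for a, b in zip(row_t, row_g))
    return max(FONT, key=lambda ch: score(FONT[ch]))

noisy_l = ("100",
           "110",   # one pixel of scanner noise
           "111")
print(recognise(noisy_l))  # → L
```

Modern systems replace the fixed templates with learned features, which is what makes them robust across fonts, layouts and image formats.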
Smart contracts – BlockChain-based smart contracts are proposed contracts that can be partially or fully executed or enforced without human interaction. One of the main objectives of a smart contract is automated escrow. An IMF staff discussion reported that smart contracts based on BlockChain technology might reduce moral hazards and optimize the use of contracts in general, but that “no viable smart contract systems have yet emerged.” Due to the lack of widespread use, their legal status is unclear.
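The automated-escrow objective can be illustrated with a toy state machine: funds are locked on creation and released to the seller automatically once the buyer confirms delivery, with no human intermediary in the release step. The `Escrow` class, parties and amounts are hypothetical, and real smart contracts run on-chain rather than as local Python objects.

```python
class Escrow:
    """Toy automated escrow: deposit is locked at creation and released
    to the seller only when the buyer confirms delivery."""
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.state = "AWAITING_DELIVERY"
        self.balances = {buyer: -amount, seller: 0}  # buyer's funds locked

    def confirm_delivery(self, confirmer):
        if confirmer != self.buyer or self.state != "AWAITING_DELIVERY":
            raise PermissionError("only the buyer may confirm, exactly once")
        self.balances[self.seller] += self.amount    # automatic release
        self.state = "COMPLETE"

contract = Escrow("buyer", "seller", 100)
contract.confirm_delivery("buyer")
print(contract.state, contract.balances["seller"])  # → COMPLETE 100
```

On a BlockChain the same rules would be enforced by every validating node, so neither party could alter the release logic after agreeing to it.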
There are a number of efforts and industry organizations working to employ BlockChains in supply chain logistics and supply chain management. TradeLens, a shipping platform developed by Maersk and IBM, is already in use and being adopted globally. The BlockChain in Transport Alliance (BiTA) works to develop open standards for supply chains. Walmart and IBM are running a trial to use a BlockChain-backed system for supply chain monitoring — all nodes of the BlockChain are administered by Walmart and are located on the IBM cloud. Hyperledger Grid develops open components for BlockChain supply chain solutions.
Currently, there are at least four types of BlockChain networks — public BlockChains, private BlockChains, consortium BlockChains and hybrid BlockChains.
A public BlockChain has absolutely no access restrictions. Anyone with an Internet connection can send transactions to it as well as become a validator (i.e., participate in the execution of a consensus protocol). Usually, such networks offer economic incentives for those who secure them and utilize some type of a Proof of Stake or Proof of Work algorithm.
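The Proof of Work mechanism mentioned above can be sketched briefly: search for a nonce whose hash of the block data starts with a required number of zero digits. Finding the nonce is costly, but anyone can verify it with a single hash — which is how open networks make cheating expensive. The payload string and difficulty are illustrative.

```python
import hashlib

def proof_of_work(data, difficulty=4):
    """Find a nonce whose SHA-256 hash of data+nonce starts with
    `difficulty` zero hex digits: costly to produce, trivial to verify."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1

nonce, digest = proof_of_work("block payload", difficulty=4)
print(digest[:4])  # → 0000
```

Raising `difficulty` by one hex digit multiplies the expected search work by sixteen, while verification stays a single hash — the asymmetry that secures public chains.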
Some of the largest, most known public BlockChains are the bitcoin BlockChain and the Ethereum BlockChain.
A private BlockChain is permissioned. One cannot join it unless invited by the network administrators. Participant and validator access is restricted. A consortium BlockChain is also permissioned, but it is governed by a group of organizations rather than a single entity, making it partially decentralized.
A hybrid BlockChain has a combination of centralized and decentralized features. The exact workings of the chain can vary based on which portions of centralization and decentralization are used.
System Integration – in information technology is the process of linking together different computing systems and software applications physically or functionally, to act as a coordinated whole. The system integrator integrates discrete systems utilizing a variety of techniques such as computer networking, enterprise application integration, business process management or manual programming. System integration involves integrating existing, often disparate systems in such a way “that focuses on increasing value to the customer” (e.g., improved product quality and performance) while at the same time providing value to the company (e.g., reducing operational costs and improving response time). In the modern world connected by the Internet, the role of system integration engineers is important: more and more systems are designed to connect, both within the system under construction and to systems that are already deployed.
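A minimal sketch of an integration layer: two hypothetical legacy systems report the same order in different shapes (numeric status codes vs string states), and a thin mapping layer presents them as one coordinated view. Every function, field and code below is invented for illustration.

```python
# Two hypothetical legacy systems expose the same fact in different shapes.
def warehouse_api(order_id):
    return {"id": order_id, "status_code": 2}            # numeric codes

def carrier_api(order_id):
    return {"orderId": order_id, "state": "IN_TRANSIT"}  # string states

STATUS_CODES = {1: "PACKED", 2: "IN_TRANSIT", 3: "DELIVERED"}

def integrated_status(order_id):
    """The integration layer maps each discrete system's format onto one
    coordinated view, so callers need not talk to each system directly."""
    wh = warehouse_api(order_id)
    ca = carrier_api(order_id)
    warehouse_state = STATUS_CODES[wh["status_code"]]
    return {
        "order_id": order_id,
        "warehouse": warehouse_state,
        "carrier": ca["state"],
        "consistent": warehouse_state == ca["state"],  # cross-system check
    }

print(integrated_status(42))
```

Enterprise application integration tools generalize this adapter pattern: each system keeps its native format, and the integration layer owns the translations and consistency checks.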