In the age of digitalization, data seem to be ubiquitous and endless. However, data are both under- and overvalued. They are undervalued because data-generating systems provide endless amounts of bits and bytes that are neither linked nor organized and therefore cannot be turned into power. Then again, they are overvalued as the whole world gets excited about "the power of data". In a narrow sense, a digital marketplace needs to be established between data owners from different departments, as many of them are not used to sharing their data. In a wider sense, that marketplace would include every single stakeholder: merchants, airlines, suppliers, public authorities like customs and the police and, most importantly, passengers.
Major hub airports are under considerable economic strain for two main reasons. The first is the growth of low-cost airlines and the resulting significant change in the price structure for legacy airlines like Lufthansa or Air France. The second is the massive capacity expansion of Middle East airports like Dubai, Abu Dhabi or Istanbul, which causes noticeable shifts in market share for transfer passengers. For example, the route from New York to Singapore can nowadays be operated via different stopovers, and the decision which stopover to choose is ultimately made based on price and the experience of former trips. Since the majority of passengers at Frankfurt Airport are transfer passengers and Lufthansa operates most of the flights, Fraport is greatly affected. Recently, additional factors such as changes in the geopolitical situation and the fear of terrorism, which restrain passengers from travelling via Frankfurt Airport, have emerged.
In order to compete on price and quality with Middle East airports, Fraport has to optimize its processes permanently and react fast to changing requirements. Therefore, Fraport created a new instrument to identify potential improvements and possibilities for optimization based on data analysis – the Smart Data Lab. The term "Smart Data Lab" is widely used to characterize a type of innovation lab that uses data as a resource and develops new business ideas from that data, see Fig. 57.1.
Fig. 57.1 Smart Data Lab as the first step in the transformation process
In reality, many Smart Data Labs merely create algorithms that are more or less useful in business practice, which sooner or later affects the acceptance of the labs. We claim that a Smart Data Lab can and should be viewed as something much more fundamental: it provides the key to open the digital marketplace. Focusing on concrete problems, with a clear factual, temporal and also spatial concentration, the lab itself could function as a hub, connecting not only data, but also experts and ideas. The major task is not a technical one. It is a political one: the task to persuade stakeholders to share and trade their data in a digital marketplace in order to create more value for everyone involved and to make that value clearly visible.
In our opinion, one of the characteristics of a Smart Data Lab is the "non-sugarcoated truth" of data and data analytics. Possible process optimizations, (labor) changes or other improvements resulting from a Smart Data Lab may unsettle or upset the affected department and/or the management. This undoubtedly requires the Smart Data Lab to prove that all of its results are derived from data analytics only and are not biased by human retention, defense or justification of current practice. It is not about blaming someone or someone's processes – it is about optimization and gaining new insights.
57.2 Data Structures at Airports
The main system of an airport is the Airport Operational Database (AODB). It consolidates all relevant data for flight operations from different sources. In this context, the AODB operates as an integrator of all relevant information as well as a delivery hub that provides that information to all stakeholders. Data come from different sources such as sensors, business process applications, radar sources, airlines, public authorities, ground handlers or freight forwarders. The main business object is a flight, uniquely identifiable by airline, trip number, arrival/departure indicator and its scheduled date. With approximately 1400 flights per day, that looks like a rather small amount of data on the main business object. But considering that each flight has over 5000 attributes and gets updated over a thousand times during its lifecycle, a significant volume of (at least in part) real-time and heterogeneous data needs to be stored and managed reliably. Data describing the airport core processes will still reside in the low terabyte range and not in petabyte dimensions.
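To make the structure of this main business object more concrete, the following Python sketch models a flight record keyed as described above. It is a hypothetical illustration only: the attribute names, types and the update method are our assumptions and not the actual AODB schema, which comprises thousands of attributes.

```python
from dataclasses import dataclass
from datetime import date, datetime
from typing import Optional

# Illustrative sketch only: the real AODB schema is far richer (over 5000
# attributes per flight); names and types here are assumptions, not Fraport's model.

@dataclass(frozen=True)
class FlightKey:
    """Unique identifier of a flight as described above."""
    airline: str            # e.g. "LH"
    trip_number: str        # e.g. "400"
    arrival_departure: str  # "A" for arrival, "D" for departure
    scheduled_date: date

@dataclass
class FlightRecord:
    key: FlightKey
    # A tiny subset of attributes, grouped like the categories in Table 57.1
    aircraft_type: Optional[str] = None
    gate: Optional[str] = None
    passengers_total: Optional[int] = None
    estimated_time: Optional[datetime] = None
    update_count: int = 0   # a flight is updated over a thousand times in its lifecycle

    def apply_update(self, **changes) -> None:
        """Merge a partial update coming from one of the many source systems."""
        for attr, value in changes.items():
            setattr(self, attr, value)
        self.update_count += 1
```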
Those core business-relevant data can roughly be classified into the categories listed in Table 57.1.

Table 57.1 Categories of and examples for business-relevant data at airports

Category – Examples
General flight data – Airline, aircraft, gate, airport, runway
Passenger data – Total on board, transit, transfer, reduced mobility
Flight operation data – Arrival and departure time, deicing, taxi time
Baggage data – Dangerous goods, pallets, animal transportation
Freight data –
Passenger handling – Security, check-in, boarding, border control
Ground handling services – On/off-loading, passenger transportation via bus, baggage transportation, freight transportation, supply and fueling, aircraft push-back, special orders
The data consist to a great extent of timestamps and numeric information and are stored in a complex data model composed of more than 450 tables. Managing and providing this information for reporting and free data analysis requires very experienced data managers who have deep expert knowledge of the business and the ability to match this to the technical data model. For all data analysis projects, the data manager is an essential resource and a key factor for the success of the project.
Since most of the data are structured and stored in a relational format, the size of this information is comparatively small in contrast to other industries. Rather than handling big volumes of data, the main effort for airports is to manage the variety of sources, and therefore the complexity of data integration, as well as the velocity of incoming streaming data for real-time data analysis and decision making.
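As a hedged illustration of how such a variety of fast, partial updates might be consolidated into one current view per flight, the following Python sketch merges events from a queue. The event layout, field names and values are assumptions made for illustration, not Fraport's actual integration architecture.

```python
import queue
from typing import Dict, Tuple

# Hypothetical sketch: source systems (sensors, airlines, ground handlers, ...)
# push partial updates onto a queue; a consumer merges them into the latest
# known state per flight. Field names and values are illustrative assumptions.

updates: "queue.Queue[dict]" = queue.Queue()

def consume(current_state: Dict[Tuple, dict]) -> None:
    """Drain the queue and merge each partial update into the per-flight state."""
    while True:
        try:
            event = updates.get(timeout=0.1)
        except queue.Empty:
            break  # no more events in this demo run
        key = event.pop("key")
        current_state.setdefault(key, {}).update(event)

# Usage example: two sources report on the same (hypothetical) flight.
state: Dict[Tuple, dict] = {}
updates.put({"key": ("LH", "400", "D", "2017-06-01"), "gate": "A52"})
updates.put({"key": ("LH", "400", "D", "2017-06-01"), "passengers_total": 312})
consume(state)
print(state[("LH", "400", "D", "2017-06-01")])
# -> {'gate': 'A52', 'passengers_total': 312}
```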
Besides flight-related data, other business departments like human resources, finance, project management or real estate management also possess a lot of interesting data. However, those data are stored separately. For years, they have received special attention from controlling and reporting, since they are mostly based on standard applications and standard business procedures. Free data analysis and data exploration has not been in the spotlight so far, but linking both data platforms will be the next big thing to address.
57.3 Airport Analytics Now and Then
Data is the new oil of the 21st century. This slogan and similar ones fuel the expectation that data analysis will help companies master their challenges and easily solve all their problems. Whether this approach really works out for all potential scenarios remains to be proven, but as of today there are some good examples demonstrating the possibilities of capturing insights and finding solutions to business problems inside a company's raw data. However, high expectations and the urge to engage in the field of data analytics have occasionally been more harmful than helpful. Once top management became aware of the idea that data might be an asset, it has often ended up doing things for the sake of doing things, with significant but haphazard investments in analytics-related facilities and services such as hardware/infrastructure, software/tools and consultancy.
Data analysis is not new at all. In the past, data analysis relied on sophisticated statistical methods to extract even the tiniest drop of information from scarce data. Now data is abundant and analytics faces very different problems, such as how to link huge and heterogeneous amounts of data and how to separate signals from pure noise. Companies not typically receptive to this topic now slowly start to change their organizations and build novel structures concerning both IT and human resources, often after they have experienced costly failures when first dealing with data and analytics in the way described above.
Fraport therefore created the Smart Data Lab and established it within the entire organization. Some of the expectations towards the lab focused on its more direct results, as for example the hope for discovery of unknown correlations that would lift the potential for quality improvement, cost reduction or profit increase. Data was supposed to generate clear and objective decision bases. Furthermore, expectations also stretched out to more indirect effects. The networking of employees with high logistical and mathematical skills would hopefully promote an innovation culture, reduce silo mentality and support a non‐hierarchical development of human resources.
Before we describe two of the projects the Smart Data Lab worked on, which we will do in the next section, we briefly characterize the concept and its implementation. The Smart Data Lab is primarily meant to be an agile and innovative laboratory environment which operates independently from existing hierarchical structures and allows experimental exploration and free trials of new ideas based on data analysis. From the beginning, failure was not only an option but was seen as an opportunity to learn and do better (fail-fast mentality). Interdisciplinary teams with cross-qualifications perform data-driven solution finding in an independent work unit that can be mandated by all departments and subsidiaries. Most importantly, the basic rule is that of a guard room free of restrictions.
All departments of Fraport can use the Smart Data Lab if they submit a precise problem statement and provide skilled employees. The latter is very useful for fostering commitment to and acceptance of the lab's work by the departments. It also makes clear that the lab and its services are extremely valuable, if not to say precious, because whoever wants to benefit from it needs to contribute. Last but not least, skill and knowledge transfer throughout the whole company is guaranteed.
Questions submitted can be of a strategic, tactical or operational nature. An example of a strategic problem was the question of how the positioning of flights at the gates affects retail revenues and which contractual conditions for future negotiations with airlines result from this insight. A tactical problem mostly encompasses the improvement of equipment and staff planning, and an example of an operational question is the prediction of more accurate arrival times during flight approach, which could help optimize ground handling processes.
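To illustrate the operational example, here is a minimal, hypothetical sketch of an arrival-time prediction on synthetic approach data. The chosen features, the synthetic data and the linear model are our assumptions and not the lab's actual solution.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical sketch of the operational use case named above: predicting a more
# accurate arrival time during approach. Features, data and model choice are
# illustrative assumptions.

rng = np.random.default_rng(0)
n = 500
distance_nm = rng.uniform(20, 200, n)        # remaining distance to the runway
ground_speed_kt = rng.uniform(250, 480, n)   # current ground speed
headwind_kt = rng.normal(0, 15, n)           # headwind component

# Synthetic "true" remaining flight time in minutes, with noise
minutes_to_touchdown = distance_nm / (ground_speed_kt - headwind_kt) * 60 + rng.normal(0, 2, n)

X = np.column_stack([distance_nm, ground_speed_kt, headwind_kt])
model = LinearRegression().fit(X, minutes_to_touchdown)

# Predict for one aircraft currently 80 NM out at 400 kt with 10 kt headwind
eta_minutes = model.predict([[80, 400, 10]])[0]
print(f"Estimated time to touchdown: {eta_minutes:.1f} min")
```

In practice such a prediction would feed the planning of ground handling resources, which is the benefit named above.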
The Smart Data Lab is conducted once a year and involves five important phases: selection of relevant questions, clarification of the problem situation, the data analysis phase, presentation of results, and transition into production. We present these phases in the following subsections.
57.3.1 Selection of Problem Statements
Every business department has the chance to submit a problem or research question to the Smart Data Lab. In a first stage, a small team from the Smart Data Lab consisting of employees from corporate development and the IT department verifies the relevance and feasibility of execution. Finally, the executive board selects four questions from the list, which guarantees compliance with corporate priorities and strategic objectives as well as commitment to them.
Now that the Smart Data Lab is well established, there is usually a wide range of problems and research questions to choose from. The business departments and the executive board have been convinced that using predictive analytics will help them make smarter, earlier decisions across many business challenges. This is usually not the case when the lab is initiated for the first time. It may be hard to persuade a department to be the first to submit a problem, mostly for political reasons. A department needs to be persuaded that at least some of its biggest business challenges might be susceptible to a predictive approach and that it is worth a try. "This persuasion task is probably more difficult than any technological issues that might come up" [2].
57.3.2 Clarifying the Problem Situation
In multiple workshops with the Smart Data Lab team and a group of qualified employees, the problem is discussed in detail and rendered more precise. Typically, people from the relevant business department start by explaining the procedure of the business process, including standard activities and exceptions. At that point it is very helpful for the entire team to take some time to observe the process in reality or to visit control centers and planning offices in order to sharpen their comprehension of the problem.
Besides a good notion of the problem itself, it is also very useful to understand the motivation for and the possible benefit of solving the issue. The goal of this phase is to give all team members a firm grasp of the problem and to determine the expectations of the business department. It has turned out to be helpful to define possible solutions classified as "gold, silver and bronze". This classification reveals the basic minimum requirements for a bronze solution and the more advanced, but nice-to-have features covered by the gold solution. Since the time for analyzing the problem is strictly limited, the classification provides a useful orientation for the team during the next phase.
57.3.3 Data Analysis Phase
In the data analysis phase the Smart Data Lab is called together for four weeks in total. During that time it is very important that all members are completely released from daily business so they can focus on working in the lab. This is essential, as those four weeks are typically the most intense and time-consuming phase of the whole lab cycle.
Up to 15 members work in a self-organized, collaborative manner using agile methods like Kanban. The members are assigned to one primary problem but are willing to work on other issues as well if help is needed. The members bring the special skills described in Table 57.2.

Table 57.2 Roles and associated skills in the Smart Data Lab
Role – Skill profile
Data scientist – Comprehensive knowledge about data mining, statistics, data engineering and advanced computing
Business analyst – Visual data exploration skills and statistical knowledge
Data expert – Knows where to find the information tracking and describing a business situation within the data model, i.e. related variables and their meanings, codes, documentation issues, etc.
Business expert – Represents the business department; knowledge about background, business demands and existing decision rules
Project manager – Coordinates overall activities
The Smart Data Lab serves as a guard room during that phase in order to protect its members and the insights collected from outside influences. It also guarantees the idea of a free and experimental working ethic where it is possible to fail without fearing any consequences. All members get unlimited access to all company data and are free to use any available tool they feel comfortable with. In this way, a maximum of efficiency and effectiveness is achieved.
It should be mentioned that it takes substantial time and effort to reach that status of a "neutral place" within the company. During the first lab phase we experienced some intense discussions with department managers who feared that the lab might uncover failures in their past and current decision practice. It is often ignored that changing decision processes from experience- and command-driven to data-driven decision making potentially challenges hierarchical structures. We cannot emphasize enough the importance of communication, political work and sensitivity: a Smart Data Lab, if it is to function as intended, initiates a change process for the organization as a whole, which requires professional change management.
Change, however, is also necessary in terms of the skill profiles of the analysts, especially when the focus of analytics turns from description to prediction or even prescription [3]. Descriptive analytics seeks to answer questions like: what happened, and when did it happen? (Almost) no understanding of the underlying process or data is needed, and in particular, time frames are irrelevant for the analysis. On the next stage, diagnostic analytics, one looks for explanations: why did it happen? To avoid confusing correlation with causation, the data-generating business process needs to be understood. When it comes to the next stage, predictive analytics, it is often necessary to transform data structures significantly, e.g. from a transactional data set to a data set consisting of customer signatures and from an ex-post to an ex-ante perspective [4]. Time plays an important role when it comes to questions like: what is likely to happen next? It can become a difficult and time-consuming task to re-build the necessary historical views, requiring professional skills in data management. Finally, prescriptive analytics as the last stage supports decisions: how can we make things happen? A model of the business process, its single components and their interactions is needed in order to optimize it. Therefore, skills in operations research and optimization techniques, e.g. numerical methods, are crucial.
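As a sketch of the ex-post to ex-ante transformation mentioned above, the following Python/pandas example turns a small, made-up transactional event log into one "signature" row per flight that contains only information known before a prediction cutoff. Column names, events and timestamps are illustrative assumptions, not real operational data.

```python
import pandas as pd

# Hedged illustration: from a transactional event log to one row ("signature")
# per flight, keeping only events known before a chosen cutoff (ex-ante view).

events = pd.DataFrame({
    "flight":    ["LH400", "LH400", "LH400", "LH401", "LH401"],
    "timestamp": pd.to_datetime(["2017-06-01 10:05", "2017-06-01 10:40",
                                 "2017-06-01 11:20", "2017-06-01 09:50",
                                 "2017-06-01 10:55"]),
    "event":     ["check_in_open", "boarding_start", "push_back",
                  "check_in_open", "push_back"],
})

cutoff = pd.Timestamp("2017-06-01 11:00")  # prediction time: use only earlier events

signature = (
    events[events["timestamp"] < cutoff]            # drop knowledge from the future
    .pivot_table(index="flight", columns="event",
                 values="timestamp", aggfunc="min")  # one row per flight
)
print(signature)
```

Rebuilding such historical, cutoff-consistent views for every past flight is exactly the data management effort described above.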