Breakout Session Summaries

Introduction

Kathleen Hancock, rapporteur

The goal of this conference was to bring together public- and private-sector experts in marine transportation and freight analytics to share ideas and to open a dialogue toward a national high-fidelity freight flow model. Breakout sessions were grouped generally by research area and category. Each session consisted of presentations selected from the call for abstracts, followed by questions and comments from attendees. The following sections provide summaries generated by recorders attending each session with the intent of identifying important ideas. The abstracts for the presentations are included in Appendix A. While potentially resulting in some redundancies, the two perspectives provide complementary views of the information generated by this conference. The following summaries do not imply consensus, ranking, or agreement from other attendees and are provided for the convenience of the reader.

Breakout Session 1A
Data Analytics: Maritime and Freight 1

Donald Ludlow, CPCS Transcom, presiding
Michael Pack, University of Maryland, recording

Presentations

Deriving Value from Waybill Data, Traffic Volumes, and the Integration of Both
Charles Edwards, North Carolina Department of Transportation, and Kevin Baughman, SAS Institute

Freight Production Modeling Using Census Microdata
Shama Campbell, Rensselaer Polytechnic Institute

Ohio Maritime Study Information Visualization Techniques
Donald Ludlow, CPCS Transcom

This session included presentations from a mix of USDOT, private-sector, and academic researchers and practitioners. While the high-level theme of the session was analytics, presenters also covered aspects of their work that ranged from modeling to communication and outreach.
North Carolina manages the most state roads in the United States after Texas, and it considers ramifications beyond the state line, as demonstrated by its investment in rail facilities outside its borders to improve freight flows within the state. The North Carolina DOT and the analytic software company SAS began a project to generate data-driven solutions that would assist the North Carolina DOT with freight planning and help inform planners on the importance of freight. Edwards outlined the objectives:

• Inform planners about freight flows in North Carolina;
• Create dashboards that combine data sources;
• Provide planners with the freedom to filter, query, and export data from the dashboard; and
• Formulate linkages between disparate data sets.

With the assistance of SAS consultants and software, the North Carolina DOT developed 10 freight dashboards. One example is the Waybill dashboard, which includes O-D sites and lets the user click on a waybill to see what it contains. Histograms of commodity type and value are also displayed. Another example is the vehicle counts dashboard, which includes the location of count strips, the year data were collected, and hourly profiles with basic filtering by year collected. Related to this conference, the port data dashboard shows shipments of containers out of Wilmington, North Carolina, and other ports and includes the number of shipments by month. Using this dashboard, the North Carolina DOT was able to see the effects of a shipping company going bankrupt and to gain other insights about the port. Charles Edwards of the North Carolina DOT and Kevin Baughman of SAS noted that consolidating data was a big challenge because so many departments were involved and that some of the data files were too large to be opened on a personal computer: thus the need for more powerful tools and summary dashboards.
Their next steps include developing projections for 5- and 30-year rail plans, forecasting commodity revenues by rail route, monitoring travel time reliability for major truck corridors, and analyzing O-Ds of commodities after they leave a port. Shama Campbell discussed products emanating from two reports: National Cooperative Highway Research Program (NCHRP) Research Report 739/National Cooperative Freight Research Program (NCFRP) Research Report 19: Freight Trip Generation and Land Use, and NCFRP Research Report 37: Using Commodity Flow Survey Microdata and Other Establishment Data to Estimate the Generation of Freight, Freight Trips, and Service Trips: Guidebook.10 Rensselaer Polytechnic Institute developed a web-based application that generates zip code and business-level estimates of freight and service activities (FSAs) at two- and three-digit North American Industry Classification System (NAICS) codes. The software uses two data sources, zip-code-level data and individual business-level data, and estimates:

• Freight production (lb/day),
• Freight attraction (lb/day),

10 NCHRP Research Report 739/NCFRP Research Report 19, TRB, Washington, D.C., 2012; NCFRP Research Report 37, TRB, Washington, D.C., 2017.
• Commodity flow survey freight production (lb/year),
• Freight shipments (shipments sent/day),
• Freight deliveries (deliveries received/day), and
• Service trip attraction (vehicle trips/day).

Campbell presented some advantages of using establishment-level models and noted that results can be estimated with relatively small samples and aggregated to any level of geography. She indicated that the models are transferable around the country. Donald Ludlow presented the process for developing a maritime approach for the Ohio DOT. The study attempted to answer three key questions:

• What assets comprise the Ohio marine transportation system (MTS)?
• Who are the existing and potential users of Ohio's MTS?
• What are the options for the Ohio DOT's role in the MTS?

He focused on the economics of the Great Lakes region, which experiences much trade within the immediate vicinity of the Great Lakes but also has significant export sea trade. Information from this analysis included statistics on the use of facilities, tonnage, and economics. Forty percent of Ohio's gross domestic product is linked to freight, mostly associated with the steel industry. Other key cargo includes project cargo, such as windmills and specialized generators, grain, and aggregates. Overall tonnage on the Ohio MTS has been declining since 2001. More than half the tonnage is on the river system, with 42 percent on Lake Erie. Iron ore is the dominant commodity shipped on the lake, while coal is the dominant commodity on the river system. Traffic from bulk sectors is in decline, dragging down total MTS volume. The automatic identification system (AIS) is used to track trading relationships. All data used in the study and the resulting visualizations helped to underscore the importance of specific facilities, such as a particular lock that may or may not be within Ohio's own control. Seeing these relationships helped to encourage out-of-state or multistate investment.
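The establishment-level freight generation approach Campbell described (per-NAICS rates applied to individual establishments, then aggregated to any level of geography) can be sketched as follows. The rate coefficients, establishments, and field names below are hypothetical illustrations, not the NCFRP-estimated values.

```python
# Sketch of an establishment-level freight generation model: a linear
# rate per two-digit NAICS code is applied to each establishment's
# employment, and the estimates are summed to ZIP-code level.
# All coefficients and records here are hypothetical.

# Hypothetical freight production (lb/day) = intercept + slope * employees.
RATES = {
    "31": (500.0, 120.0),  # manufacturing (illustrative coefficients)
    "42": (300.0, 80.0),   # wholesale trade (illustrative coefficients)
}

establishments = [
    {"zip": "12180", "naics2": "31", "employees": 25},
    {"zip": "12180", "naics2": "42", "employees": 10},
    {"zip": "27601", "naics2": "31", "employees": 40},
]

def freight_production(est: dict) -> float:
    """Estimated freight production (lb/day) for one establishment."""
    intercept, slope = RATES[est["naics2"]]
    return intercept + slope * est["employees"]

# Aggregate establishment-level estimates up to ZIP-code geography.
by_zip: dict[str, float] = {}
for est in establishments:
    by_zip[est["zip"]] = by_zip.get(est["zip"], 0.0) + freight_production(est)

print(by_zip)  # lb/day totals per ZIP code
```

Because the model operates at the establishment level, the same estimates can be re-aggregated to counties, corridors, or states without re-estimation, which is the transferability advantage noted above.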
KEY TAKEAWAYS

• The role of USDOT is limited largely to broader transportation policy and planning efforts, along with making targeted MTS-related investments in things such as facilities, connections, and cranes. The most beneficial near-term focus for USDOT would be enabling better connectivity where maritime provides a competitive advantage and is in line with market trends.
• Data challenges include finding ways to get access to the data for daily operations. The industry is exploding right now, and data have not matured enough to support that growth. In particular, the industry needs to work on public–private policies and data exchanges and on finding a way to anonymize data so that the information can be used to inform public policy without negatively affecting business.
Breakout Session 1B
Data Analytics: Inland Waterways

Joe Crabtree, Kentucky Transportation Center, presiding
Jim Kruse, Texas A&M Transportation Institute, recording

Presentations

Inland Marine Transportation Data Integration: Extracting Additional Value from Publicly Available Data
James Dobbins, FACTOR, Inc.

Measuring the Network Impacts of Local Disruptions: An Inland Waterways Case Study
Craig Philip, Vanderbilt University

Modeling Dynamic Behavior of Navigable Inland Waterways
Heather Nachtmann, University of Arkansas

Quantifying the Impacts of Disruptions to the Inland Marine Transportation System
Patricia DiJoseph, U.S. Army Corps of Engineers

Inland waterways are an important component of the transportation system, and this session considered several related aspects. James Dobbins provided examples of how to extract value from public data by using modern business intelligence (BI) platforms and analytics. In particular, data management is important and includes movement and consolidation of data, data wrangling,11 quality checking and cleansing, and loading the data warehouse. Dobbins then explained how BI tools perform advanced extract, transform, and load (ETL) steps12 to produce dashboards, data-driven alerts, forecasting, and automated insights. The data sets included the U.S. Army Corps of Engineers (USACE) Waterborne Commerce Statistics, the U.S. Department of Agriculture (USDA) Grain Transportation Report, National Weather Service data on river stage and flow, Corps Locks XML services, M reports, various AIS data providers, and Master Docks Plus. Dobbins noted that Excel could do much of the same analysis that he presented, but BI handles greater file sizes. BI can create maps that automatically refresh and show queue status. The types of analytics that he presented included the current conditions at a lock, the average delay at a given lock for different time periods, the lock-to-lock travel time, and a tow cut indicator.
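The kind of ETL step Dobbins described (extract raw lockage records, transform them into per-lock summaries, load the result into a dashboard or warehouse) can be sketched as follows. The field names, lock identifiers, and delay values are hypothetical, not an actual USACE feed.

```python
# Minimal extract-transform-load (ETL) sketch over lock delay data.
# Extract: read raw CSV records; Transform: compute average delay per
# lock; Load: the summary dict would feed a dashboard or warehouse.
# All input data below are hypothetical.
import csv
from collections import defaultdict
from io import StringIO

# Extract step: hypothetical raw lockage records (lock id, delay in minutes).
RAW = """lock,delay_min
Lock52,45
Lock52,30
Lock53,120
Lock53,90
"""

def average_delay_by_lock(raw_csv: str) -> dict[str, float]:
    """Transform raw lockage rows into average delay (minutes) per lock."""
    totals: dict[str, float] = defaultdict(float)
    counts: dict[str, int] = defaultdict(int)
    for row in csv.DictReader(StringIO(raw_csv)):
        totals[row["lock"]] += float(row["delay_min"])
        counts[row["lock"]] += 1
    # Load step: these summaries would be written to the data warehouse.
    return {lock: totals[lock] / counts[lock] for lock in totals}

print(average_delay_by_lock(RAW))
```

As Dobbins noted, a spreadsheet can perform the same aggregation; the advantage of a BI pipeline is that this step runs automatically at scale and refreshes the downstream dashboards.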
It is possible to track activity by barge tow size and by barge company. The analytics can focus on the lock or the pool and the commodity. These outputs enable the analyst to spot

11 Data wrangling is the process of transforming and mapping data from one raw form into another format with the intent of making the data more appropriate and valuable for a variety of downstream purposes, such as analytics.
12 The three ETL database functions are combined into one tool to pull data out of one database and place it into another database.
unusual spikes in activity or activity in unexpected locations that may indicate a business opportunity. Craig Philip noted that more frequent disruptions have occurred on the Ohio River over the past year or two and that environmental conditions have exacerbated already existing problems. He discussed network impacts of disruptions. The study team analyzed 170 locks on the main inland waterway system and selected four for detailed study on the basis of physical lock characteristics, lock performance, and network value. The primary data sets used in the analysis included Waterborne Commerce Statistics Center data, Lock Performance Management System data, Surface Transportation Board Waybill data, and GIS. The study employed several tools, such as the barge costing model, rail cost model, motor carrier cost model, and ancillary costs based on a variety of sources. To determine if rail could accommodate diversions, the team developed a rail capacity concentration index. The analysis concluded that a significant amount of current north–south traffic would be diverted to the Pacific Northwest. In calculating regional economic effects, the team used multipliers from previous National Waterways Foundation studies. Philip noted that because of their network nature, lock impacts are not mutually exclusive. The study demonstrated a method of defining the role of locks in the waterway system and provided a way to calculate the shipper cost burden caused by lock closures and to show employment impacts by region. Heather Nachtmann presented a regional economic impact study that focused on the McClellan–Kerr Arkansas River Navigation System (MKARNS) and what would happen if the system were to be closed. The study simulates resulting impacts on navigation, hydroelectric power generation, and recreation and is informing an ongoing Regional Resilience Assessment Program of the MKARNS led by the U.S. Department of Homeland Security.
The study team developed a maritime transportation simulator (MarTranS) that studies the relationships between waterway system components and economic impact factors to examine long-term system behavior over a 50-year period. A case analysis of MKARNS indicates that gross domestic product increases until 2022, then declines because of increased lock/dam disruptions and flattens in 2034, with dry cargo and dry bulk commodities having the largest impacts. Doubling the capacity of congested docks resulted in a 4 percent economic improvement over the base scenario and a 2 percent average flow increase. However, the annual sales increase did not justify the port expansion investment costs. Investing in lock/dam rehabilitation increased the life of MKARNS by more than a decade and resulted in a 53 percent economic improvement over the base scenario. The analytics provided by the simulation include trip characteristics by commodity type, port processing performance, lock processing performance, total shipment cost, holding costs, and pricing behavior. The simulator provides insights into system disruptions, demand changes, lock/dam failures, port capacity expansion decisions, and channel-deepening project investments. The objective of Patricia DiJoseph's research was to measure quantitatively the impact of a waterway disruption event on vessel traffic in terms of travel time. The research used AIS vessel traffic data to determine baseline travel times, recovery period duration, additional travel time incurred by vessels, and postevent travel times. The AIS data
were acquired from the USACE Lock Operations Management Application and the USCG Nationwide Automatic Identification System. The research developed and used the AIS Analysis Package. The research calculated the baseline travel time and then identified the time period with travel times above the baseline following a disruptive event and the total travel time above the baseline. A case study focused on a recent failure at Lock 53. Ongoing research efforts include determining commerce impacts by coupling results to vessel freight data sets and converting travel time to costs. The team is also producing a waterway travel time map that should be available in early to mid-2019.

KEY TAKEAWAYS

• It is possible to do much more with existing, publicly available data than has historically been done. With today's software it is possible to tease information out of a number of databases that provide a better picture of how the inland waterway system is performing at the lock level and as a system.
• The ability to evaluate the state of the inland waterway system and the reaction by users to major events requires access to trip data from USACE and the Waybill data. AIS data are available but must be purchased.
• Several analytical processes can be applied across a broader geographical scope, including
  – A lock's role in the system and its importance to the system,
  – Shipper cost burden caused by lock closure,
  – Graphical representations that show a lock's traffic patterns across the country and its network effect, and
  – Rail capacity concentration index for a selected corridor.
• With AIS data, it is possible to establish baseline travel times and to determine how a disruption to the system affects those times and how long it takes to return to normal.
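The baseline and excess travel-time calculation described above for AIS-derived transits can be sketched as follows. The transit times are illustrative values, not actual AIS records, and the recovery rule (first post-event transit back at or below baseline) is one simple possible definition.

```python
# Sketch of the baseline/excess travel-time method: establish a baseline
# lock-to-lock travel time from pre-event transits, sum the travel time
# above baseline after a disruption, and find when times return to normal.
# All transit times below are illustrative, not actual AIS data.
from statistics import median

pre_event_hours = [10.0, 11.0, 10.5, 9.5, 10.0]    # transits before the event
post_event_hours = [18.0, 16.0, 12.0, 10.5, 10.0]  # transits after the event

# Median of pre-event transits gives a baseline robust to outliers.
baseline = median(pre_event_hours)

# Total additional travel time incurred by vessels after the event.
excess = sum(max(t - baseline, 0.0) for t in post_event_hours)

# Recovery point: index of the first post-event transit at or below baseline.
recovered_at = next(i for i, t in enumerate(post_event_hours) if t <= baseline)

print(baseline, excess, recovered_at)
```

Coupling the excess hours to vessel freight data and cost-per-hour figures, as the ongoing research described above does, would convert this travel-time impact into a commerce impact.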
Breakout Session 1C
Decision Support: Resilience

Josh Murphy, National Oceanic and Atmospheric Administration, presiding
Sandra Knight, WaterWonks, LLC, recording

Presentations

2017 Hurricanes: A Resilient Path Forward for Marine Transportation System Federal Agencies
Katherine Touzinsky, U.S. Army Corps of Engineers

A GIS Inventory and Exposure Assessment for Critical Coastal Transport Infrastructure Land Use in the Caribbean Small Island Developing States
Gerald Bove, University of Rhode Island

Barriers to Climate and Extreme Weather Adaptations for Seaports: A Cultural Consensus Model for North Atlantic Medium and High-Use Port Decision Makers
Elizabeth McLean, University of Rhode Island

Using Geographic Information Science to Evaluate Legal Restrictions on Freight Transportation Routing in Disruptive Scenarios
Steven Peterson, Oak Ridge National Laboratory

This session demonstrated the interdependencies of sectors and organizations when decisions are being made about recovery and responding to disasters. Combining both analytics and expert elicitation, the four speakers covered various ways to assess the impacts of disasters on transportation systems. Coupling the use of analytic data from AIS with lessons learned regarding freight logistics, Katherine Touzinsky highlighted the complications of decentralized information and miscommunications in providing a speedy and effective recovery for the communities impacted by Hurricanes Harvey, Irma, and Maria. Seeking to reduce the exposure of seaports and airports on islands in the Caribbean, Gerald Bove and his team assessed the current and future operational impacts of storms and sea level rise by using GIS modeling and data analysis. The criticality of these lifelines to islands makes it even more important to get the analysis right. While regional assessments can be helpful, finer resolution and better data are needed for localities to prepare.
Elizabeth McLean used expert elicitation of port authorities to assess the vulnerability and the adaptive capacity of ports to the rising threats of extreme weather and sea level rise. Lack of funding and lack of risk awareness were the top two barriers to resilient investing. Steven Peterson demonstrated the importance of bringing analytics to policy making. Using a transportation GIS tool, he presented how the analysis could be used to better understand the impacts of policies on freight movement scenarios in a poststorm environment.
Several barriers and issues were identified by the presenters. These included

• Poor communication and missing information during disasters,
• Short-term investments over long-term solutions and misaligned time horizons,
• Lack of ownership and leadership,
• Decentralized requests and access to information and data,
• Classified data complicating the use of remote sensing data,
• Lack of data resolution at the local scale, and
• Lack of funding and awareness keeping ports from implementing resilience measures.

Each of the authors presented actionable concepts to improve resilience in U.S. ports and waterways. These included

• Improved multilingual communications,
• Better elevation data,
• Priority setting for resilience and recovery,
• Proactive identification of policy solutions and mitigative measures,
• Improving adaptive capacity, and
• Better modeling capabilities for decision makers in advance of and during events.

The panel demonstrated innovation in its approaches to a more resilient future for the transportation system. Analytics, scenario analysis, and capturing lessons learned all improve decision-making, whether it is real-time response to logistics challenges, long-term planning for mitigating impacts, or the quantifying of existing and proposed policies.

KEY TAKEAWAYS

There were a number of key takeaways from the session:

• First, marine transportation plays a critical role in economic and community resilience both pre- and postdisaster.
  – As in the case of Puerto Rico, resilience of data and information can be as important as the resilience of the infrastructure itself.
  – While the San Juan port was opened quickly, data on port status and the location of freight were difficult to find.
  – Because island seaports and airports are the lifelines to response and recovery in an island environment, it is important to have the detailed data needed to accurately model impacts.
• Some critical barriers to overcome in taking resilient actions in ports are funding, lack of risk awareness, fragmented administrations, geographical constraints, and communications.
• Analytics can inform transportation policy as well as help decision makers in the wake of a disaster.
Breakout Session 2A
Data Analytics: Maritime and Freight 2

Heather Nachtmann, University of Arkansas, presiding
Matthew Chambers, Bureau of Transportation Statistics, recording

Presentations

Container Ship Routing and Scheduling with Multiple Time Windows
Anastasios Charisis, Florida Atlantic University

Modeling Agricultural Commodity Flows on the U.S. Railroads and Inland Waterways System Using Waybill and Waterborne Commerce Statistics Data
Steven Peterson, Oak Ridge National Laboratory

Short Sea Shipping Versus Trucking: A Cost–Benefit Analysis Using Mathematical Modeling
Evangelos Kaisar, Florida Atlantic University

All modes of transportation are key for moving passengers and freight around the world. Freight transportation moves commodities to markets, facilitates global trade, and supports the world economy. Marine transportation is an integral part of the global supply chain. Short sea shipping and marine highways can transport freight and passengers domestically, along coastal or inland waterways, without crossing an ocean. Anastasios Charisis discussed how models can be used to optimize routing and scheduling of marine transportation, particularly liner shipping. Container ships require greater capital investments as capacity, container volumes, and international trade have grown. Optimizing container ship routes, supply–demand matching, cargo allocation, and timing may reduce costs and improve utilization. Multiple time windows for port calls add flexibility and prioritize perishable cargo. Steven Peterson showed agricultural cargo flow through a multimodal network (e.g., highway, rail, and waterways). Understanding these flows in the United States helps to determine pricing and to focus infrastructure investment, which requires identifying the flows' origins and destinations and determining their collection and storage locations, routes, and critical infrastructure along the way.
Waybills identify, in carloads and tons, an agricultural commodity's origins and destinations as well as any junctions or transfers along the way as it is moved by rail and railcar. The USACE Waterborne Commerce Statistics helped identify the flow of agricultural commodities, and Master Docks may identify origins and destinations along the waterways. Evangelos Kaisar indicated that modal choice relies heavily on the type of product being transported, and many products utilize inter- and multimodal transportation (e.g., air,
highway, rail, and waterways). Trucking is preferred in the United States but suffers from issues such as congestion, increasing costs, and shortages. Short sea shipping, by methods such as barge and roll-on/roll-off, is clean, efficient, and preferred in Europe. Short sea shipping is cost-prohibitive in the United States because of higher costs.

KEY TAKEAWAYS

• Models, such as an optimization approach to help set prices or a savings approach to minimize costs, both accounting for their underlying factors, can be used for decision-making.
• Many assumptions are made when models are being developed and run, and these assumptions can lead to more questions.
• Modelers decide on the parameters, make assumptions, and analyze the results.
• All models (e.g., flow, optimization) depend on data availability and quality, definitions, and computing power.

Breakout Session 2B
Decision Support: Managing Flows

Patricia DiJoseph, U.S. Army Corps of Engineers, presiding
Nichole Katsikides, Texas A&M Transportation Institute, recording

Presentations

Applying Multimodal Freight Network Optimization to Public-Sector Investment Decisions
Mark Berndt, Quetica

Data Science Approach to Bottleneck Identification: Freight Network Analytics
Catherine Lawson, University at Albany

Multiagency Data Fusion Informs Waterway Management
Brandon Scully, U.S. Army Corps of Engineers

The Voyage Plan: The Missing Link
Brian Tetreault, U.S. Army Corps of Engineers

This session focused on ways to analyze data on freight flows to support transportation decision-making and operations. Four speakers discussed ways to use data by conflating data, visualizing data, and applying data in nontraditional ways to glean more information than analysts have in the past. The session included a discussion of using transportation (voyage) plans as a means of gleaning information about vehicle movements ahead of time for operational improvements. Much discussion focused on how
to bridge the gap between the ways the public and private sectors use data and how to chip away at the challenges of proprietary data by demonstrating how anonymized and aggregated freight data used by the public sector would benefit the private sector through improved public-sector decision-making and investment. Mark Berndt presented an optimization model focused on demonstrating how Quetica, a freight network planning company, has been able to integrate different data flows and use granular data to estimate supply chain flows. The Iowa DOT and other public-sector agencies used these data to map supply chains and to develop plans and operations that help shorten those supply chains for economic opportunity. The modeling process involves an innovative approach to optimization, a mathematical method for finding the optimal solutions to improve supply chains. The results support public-sector planning to optimize supply chain networks and private-sector operations to reduce transport costs. The Iowa DOT used the method to map supply chains for key Iowa industries and identified some Iowa businesses that could improve their supply chains by partnering with transportation and planning entities in Iowa and thus remove some out-of-state supply chain links. If fully realized, moving the links to Iowa would shorten supply chains and create significant efficiencies. Despite demonstrated success, Berndt discussed challenges in bringing the public and private sectors together on data analysis. The private sector focuses on the individual trees, while the public sector takes a forest-level, or global, view. Additionally, the private sector evaluates and optimizes its supply chains regularly. Aggregate data used by the public sector are dated and out of sync with private-sector decision-making.
Options are needed to marry aggregate public data and detailed private-sector data so that the public sector has better, more relevant data to map decision-making to business supply chains. Catherine Lawson discussed approaches to identifying bottlenecks through freight network analytics, specifically the need for new approaches to freight data analytics to understand freight flows and support decision-making. Like Berndt, Lawson described changes in trade flows that make previous data unreliable for forecasting purposes. She also presented tools developed through her research to support freight analysis using FHWA's National Performance Management Research Data Set (NPMRDS). The NPMRDS Performance Measurement Tool Suite, available through the Albany Visualization and Informatics Labs (AVAIL), provides the visualization tools needed to analyze and report network performance, run corridor analyses, and conduct project analyses at various geographic and temporal resolutions. Some challenges she described included the differences between data sources and the algorithms used. The AVAIL tool uses three types of bottleneck algorithms; depending on which method is used to analyze the data, the results can be quite different. Lawson discussed the need for a consensus approach to measuring freight analytics and special outreach to the freight community for improved urban and long-haul truck data. Brandon Scully discussed using multiagency data in conjunction with waterway data to support waterway management and using the USACE AIS data for analysis. The AIS provides vessel locations, times, and durations. USACE has used heat maps to present the resulting information. Other early uses include getting a sense of volumes and where dwell was occurring. Emerging uses include matching AIS data with events such as port
shutdowns or storms to determine impacts on ship movements, routing, dwell, and other issues that impact ship flows. Additionally, the AIS data can help in forensic analysis, for example, in pinpointing vessel locations and movement data when a near miss or accident occurs and in tracing the corresponding vessel movements. The data can also be used to understand user response to dredging and how this can change routing and vessel movement activities. Use of the AIS data is revealing more information to provide a data-driven approach to managing the waterways similar to spatial–temporal data from highway probes. Although there is significant opportunity with AIS data, working with freight partners is necessary to answer many analytical questions and to supplement AIS data with private-sector information to tell the best story and to use the information most effectively in decision-making for planning and waterway investments. Brian Tetreault discussed voyage planning as the missing link in decision support analytics and improving visibility of what is happening on the waterways. Voyage plans, similar to flight plans in aviation, provide a description of a vessel's planned journey from start to finish as well as advance notices of vessel movements. A filed voyage plan would provide origin and destination data, estimated time of departure, and estimated time of arrival. This type of prefiled information would allow the public sector to be aware of what is moving, when, and by whom and thus provide increased safety and security, especially for sensitive materials. Prefiled information would also provide a way to manage waterways better, similarly to the Transportation Systems Management and Operations approach to highway management, with the transportation network operated as part of a system.
Another benefit would be the ability to understand delay and to identify points of congestion on the waterway, along with impacts on the transportation network as a whole, such as the ripple effect of a delayed vessel on intermodal operations. Similarly, a voyage plan system would provide improvements for industry users, such as improved intermodal coordination and supply chain visibility that would help system users understand where delays and issues are occurring for supply chain optimization. Challenges to this type of system include

• Information protection issues;
• Stovepipes within and across government agencies and industry that challenge information sharing;
• Technology, as there are many data systems involved and a lack of common standards and data conflation methods; and
• Communication issues in terms of how current reports and potential voyage plan data would be shared and communicated.

Currently, USACE is working on a prototype information framework that supports the exchange of navigation information.

KEY TAKEAWAYS

• Data and advanced analytics can inform decisions. More information can be derived from current data sources by applying advanced analytics.
• Standards and methods are important because the information will be different depending on the methods used.
• Marine data, such as AIS and other sources, provide important information. Applying advanced analytics can provide information about performance.
• USACE is working on a platform to share data between the public and private sectors.

Breakout Session 2C
Data Analytics: Port Performance

Scott Drumm, Port of Portland, presiding
Donald Ludlow, CPCS Transcom, recording

Presentations

Container Ship Bay Time and Crane Productivity: Are They on the Path of Convergence?
Shmuel Yahalom, SUNY Maritime College

Container Ship Dwell Times Through the Automatic Identification System Lens
Daniel Smith, Tioga Group, and Daniel Hackett, Hackett Associates

Ports of the Future: Deploying Emulation and Real-Time Simulation for Identifying Technologies for Improved Port Supply Chain Performance
Lawrence Henesey, Blekinge Institute of Technology

The Application of Freight Fluidity Metrics to the Port Environment
Kenneth Mitchell, U.S. Army Corps of Engineers

The focus of this session was to introduce the latest analytical techniques for assessing port performance. Presentations examined a range of data sources and approaches, from the application of AIS data to measuring port fluidity to the use of operational and simulation data to improve on-dock operations and investment decision-making. Large ships provide economies of scale at sea but diseconomies of scale at the port, because the number of containers that a gantry crane must handle in a cargo bay is much larger. While crane productivity has increased by 90 percent over the past 20 years, from about 20 lifts per hour to 38, container ship capacity has increased by 202 percent during the same period, resulting in a port productivity gap.
To diagnose this gap better and to help ports make better productivity investment decisions, Shmuel Yahalom introduced advanced analytical techniques to measure the discharge and load (D&L) time of a container ship bay by examining the relationship between container ship bay size and gantry crane productivity. The research examined crane productivity, measured in lifts per hour per crane. Crane productivity is affected by technology, operator skills, container location, gross and net output, different ports and terminals, and management. The
research also used bay time (the amount of time to discharge and load a fully loaded cargo bay of a container ship) and berth time (the amount of time between vessel docking and undocking, which is also a function of discharging and loading). Comparing these measures, Yahalom found a direct relationship between container ship size and discharging and loading time, demonstrating the port diseconomies of scale for larger container ships. He indicated that beam/bay size is the most important factor determining bay and berth time: the higher the crane productivity, the shorter the bay time and, consequently, the berth time and port time. The inherent diseconomies of scale of large bays, due to ports' lagged productivity adjustments, result in

• An increase in the number of ports of call per voyage for large container ships, resulting in a new normal; and
• Stowage planners of large container ships avoiding a high concentration of cargo for the same port in adjacent bays and/or in one bay.

The solution to these issues is to increase crane productivity via the number of cranes and sophisticated spreaders.

Daniel Smith and Daniel Hackett examined ship dwell times using AIS data. Their research addressed three questions:

• What drives container vessel dwell time?
• How do container vessel dwell times relate to scheduled calls?
• What causes container vessel delays or long dwells?

Using a complete set of 2016 U.S. container vessel call records (18,500 calls), they found that vessel size is a weak predictor of dwell time, while the average twenty-foot equivalent units (TEUs) per call is a better predictor. The team tracked a class of identical vessels across multiple ports to demonstrate that dwell times rise as the expected cargo volumes rise.
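That size-versus-volume relationship amounts to regressing dwell time on TEUs exchanged per call. A minimal ordinary-least-squares sketch, with hypothetical numbers rather than the study's data, looks like this:

```python
# Minimal ordinary-least-squares sketch: vessel dwell time (hours) versus
# TEUs exchanged per call. The numbers are hypothetical illustrations,
# not the study's data.

calls = [  # (teu_per_call, dwell_hours)
    (1500, 14), (2500, 19), (4000, 26), (5500, 33), (7000, 41), (9000, 50),
]

n = len(calls)
mean_x = sum(x for x, _ in calls) / n
mean_y = sum(y for _, y in calls) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in calls)
         / sum((x - mean_x) ** 2 for x, _ in calls))
intercept = mean_y - slope * mean_x

print(f"dwell = {intercept:.1f} h + {slope * 1000:.1f} h per 1,000 TEU")
```

A positive, well-fitting slope on TEUs per call (and a weak one on vessel size) is the pattern the presenters reported.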
They found that

• Vessel dwell times are mostly clustered between 8 and 48 hours, with variation in dwell times directly related to the TEUs exchanged;
• Early and late arrivals may lead to longer dwells for the same vessel class;
• Terminals may be minimizing cost within the time available before outbound sailing, not minimizing dwell time; and
• Unplanned surges could lead to labor or equipment shortfalls and congestion.

The data also showed impacts of weather: Northeast port maximum dwell times rise with inclement weather from December through March. Notably, West Coast ports do not show the same pattern. Smith and Hackett concluded that cleaner and more accessible AIS data would improve research, especially as the maritime research community gains experience with AIS data.

Lawrence Henesey discussed the deployment of emulation and real-time simulation for identifying technologies to improve port supply chain performance. He showed that although ports use terminal operating systems (TOSs) to manage operations, TOSs are
becoming increasingly complex and costly to implement. TOSs also face development challenges, including

• Central control instead of central intelligence;
• Prevention of collisions;
• Direct handshakes, which require synchronization; and
• Finding the optimal sequence of working orders [operations research (OR) methods].

Emulation technology can reduce the risks associated with TOS deployment because it allows for virtual testing of the TOS. Emulation technology, consisting of a simulation model of the TOS deployment, can also reduce the time to go live by accounting and correcting for risk, thus allowing the port to save time and money. Henesey provided several examples of the use of emulation and its positive impact on TOS deployment.

Kenneth Mitchell and a team of researchers from USACE and the Texas A&M Transportation Institute developed port fluidity measures using 2014 and 2015 cargo and noncargo AIS data and Lloyd's Register data. The approach applied the collective experience of the highway freight fluidity research community to develop similar metrics for the port environment, using AIS data instead of vehicle probe data to assess port performance. The team identified the following key measures:

• Total port system time: time from arrival at anchorage to exit from the port;
• Port cycle time: time from entrance into the channel to exit from the port;
• Travel time inbound and outbound: baseline travel time, normal time (25th percentile);
• Travel time index: average of the ratios of individual trip times to the baseline;
• Planning time index: ratio of the 95th percentile travel time to the baseline travel time for the same trip; and
• Time above baseline travel time: trip time above the baseline.

After screening outliers, the team found that port delay, or the time above baseline travel time, is most often caused by dock activities, not by congestion (unlike highway delay).
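Under the definitions above, the fluidity measures reduce to percentile arithmetic over observed trip times. A minimal sketch with hypothetical transit times follows; the linear-interpolation percentile is an assumption, since the team's exact estimator is not specified:

```python
# Sketch of the port fluidity measures listed above, computed from a set
# of inbound transit times (hours). All values are hypothetical.

def percentile(values, p):
    """Linear-interpolation percentile (p in [0, 100]) -- an assumed method."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

trip_hours = [4.0, 4.2, 4.5, 4.8, 5.0, 5.5, 6.1, 7.0, 9.5, 14.0]

baseline = percentile(trip_hours, 25)              # "normal" travel time
travel_time_index = sum(t / baseline for t in trip_hours) / len(trip_hours)
planning_time_index = percentile(trip_hours, 95) / baseline
time_above_baseline = sum(max(0.0, t - baseline) for t in trip_hours)

print(f"baseline (25th pct): {baseline:.2f} h")
print(f"travel time index:   {travel_time_index:.2f}")
print(f"planning time index: {planning_time_index:.2f}")
print(f"time above baseline: {time_above_baseline:.1f} h total")
```

The long right tail (the 9.5 h and 14.0 h trips) drives the planning time index well above the travel time index, which is exactly the kind of reliability signal these measures are designed to expose.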
Through the research, the team observed the following:

• AIS data can be used to evaluate port mobility.
• Some data issues related to port freight fluidity measures and calculating parameters remain to be overcome.
• Differences exist between roadway and waterway mobility measures.
• Vessel type is a key factor in port mobility, for example:
  – Container: shorter stays and better overall performance;
  – Bulk: longest stays, often 10 or more days in port.
• More testing is needed.

KEY TAKEAWAYS

• Port productivity is critical.
• AIS data are shedding new light on port operations.
• AIS data can be used to evaluate port mobility.
• Ports are using emulation to simulate operations in their TOSs to build and synchronize jobs and tasks.

Breakout Session 2D
Decision Support: Safety

Todd Ripley, Federal Maritime Administration, presiding
Scott Brotemarkle, TRB, recording

Presentations

An Exploratory Study of Near-Miss Events in Maritime Freight Systems Using the Marine Information for Safety and Law Enforcement Database
Robin Dillon-Merrill, National Science Foundation

Examining Maritime Casualties from Vessel Groundings Using Ordered Probit Modeling
Fatima Zouhair, U.S. Coast Guard

From Text to Data: How the U.S. Coast Guard Used Accident Reports for Benefits Analysis
Douglas Scheffler, U.S. Coast Guard

Researchers and practitioners in the federal sector are continually seeking new ways to harness existing federally mandated data sets to better understand the activity within their agency purviews. In the marine transportation area, USCG maintains the Marine Information for Safety and Law Enforcement (MISLE) database system to warehouse data on marine accidents, pollution incidents, search and rescue cases, law enforcement activities, and vessel inspections and examinations. This session included presentations on leveraging the MISLE database to extrapolate trends and learning related to near misses, vessel groundings, and regulatory effectiveness.

Robin Dillon-Merrill presented the results of a pilot project to understand whether commercial waterways users are learning from near-miss events. By using MISLE data elements that could serve as proxies for near misses, the research seeks to identify trends in this regard.
Although near misses are not reported in MISLE, indirect information includes notifications that do not become preliminary investigations, preliminary investigations that do not become casualties, and casualties in which no person was injured or killed and no vessel damage occurred. To determine whether learning was occurring, the research examined the relationship between the occurrence of near misses and the occurrence of serious casualty incidents. The assessment of 150 large ports and 400 of the largest waterways by trip yielded the insight that minor casualties in a previous year serve
as the most consistent predictor of more serious events. From this first broad analysis, the research concluded that near misses currently identified in MISLE may provide warning signs of future serious events and that the number of near misses in a previous year correlates with serious events in the following year. In a follow-on case study of the Houston–Galveston–Texas City area, which also aggregated the data by vessel type and incident subtype, the month-level results consistently showed minor casualties to be predictive of more serious events; over time, evidence of learning to reduce fatalities was observed. A possible future strategy raised during the question-and-answer period was the potential value of linking vessel AIS data to MISLE for additional insights and analysis.

Fatima Zouhair reported that vessel groundings accounted for 53 percent of all accident types between 2002 and 2016 for U.S.-flagged vessels in coastwise and inland trades. USCG examined the primary factors influencing more than 10,000 vessel groundings during this period and estimated the marginal effects of the contributing factors on injuries. Ultimately, the goal is to use this information when formulating policy or regulatory initiatives to mitigate groundings. Model variables included injury severity, initiating event type, ship type, weather condition, and time of day, among others. On the basis of ordered probit modeling, the data show that the majority of vessel-grounding accidents cause minor and moderate injuries and that the increase in minor grounding injuries was greatest for the coastwise and passenger variables. The study is part of an effort to assist policymakers in developing regulations that promote safety by improving the design of vessels and by educating and training vessel operators.
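The ordered probit structure behind such an injury-severity model can be illustrated with a small sketch. The cutpoints and the "coastwise" coefficient below are hypothetical placeholders, not estimates from the study:

```python
import math

# Sketch of an ordered probit for injury severity (none / minor / moderate+).
# Coefficients and cutpoints are hypothetical, for illustration only.

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def category_probs(xb, cuts):
    """P(severity = k) given latent index xb and ordered cutpoints."""
    bounds = [-math.inf, *cuts, math.inf]
    return [phi(hi - xb) - phi(lo - xb) for lo, hi in zip(bounds, bounds[1:])]

cuts = (0.5, 1.8)        # hypothetical cutpoints: none | minor | moderate+
beta_coastwise = 0.4     # hypothetical coefficient on a "coastwise trade" dummy

p_base = category_probs(0.0, cuts)
p_coastwise = category_probs(beta_coastwise, cuts)

# Marginal effect of the coastwise dummy on the "minor injury" probability:
me_minor = p_coastwise[1] - p_base[1]
print(f"P(minor): baseline {p_base[1]:.3f} -> coastwise {p_coastwise[1]:.3f} "
      f"(marginal effect {me_minor:+.3f})")
```

Marginal effects of this form, computed from the fitted coefficients, are what allow the study's finding that the increase in minor-injury probability was greatest for the coastwise and passenger variables.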
The audience posed questions related to whether these findings could be construed as precursors of major incidents and whether USCG could use these learnings to identify hot spots for groundings and mark the channels with navigational aids to mitigate future accidents.

By way of background, Douglas Scheffler conveyed that USCG rulemaking requires data-supported cost–benefit analyses. Although the MISLE database contains elements of accident information, the frequent sparsity of causal information creates a data gap for benefits analysis. He indicated that the goal is to bridge the root-cause data gaps in the database through expert review panels and text analysis of other reports, documentation, and surveys related to casualties. The panels reviewed 300 accident reports and cases, which contained everything from witness accounts and photos to engineering reports. A consistent expert review methodology was used to integrate the qualitative data via a capture tool. A risk reduction assessment was then performed to determine whether the nascent regulation, Subchapter M, which regulates safety and compliance for towing vessels, would have been effective in mitigating the accident. Qualitative risk reduction effectiveness scores were assigned to each effectiveness category, representing the parts of the Subchapter M regulations, as shown in Figure 2. The risk reduction scores were then converted to monetized benefits, shown in Figure 3, related to several categories. Total monetized benefits could then be run against the entire Subchapter M rule to evaluate whether benefits exceeded the costs of instituting the
regulations.

FIGURE 2. Inspection of towing vessels rulemaking, Title 46, Subchapter M.
FIGURE 3. Risk reduction scores to benefits value of accident consequences.

Questions from attendees centered on whether machine learning and algorithms could be introduced to automate some of this process. In addition, a participant asked whether the MISLE database could be improved to contain some of these critical pieces of qualitative data in the future.

KEY TAKEAWAYS

• The lack of available near-miss and accident causation data in the maritime domain necessitates proxy methodologies and qualitative analysis to determine correlations, trends, and predictions and to fill data gaps.
• The USCG MISLE database has limitations but also tremendous opportunity, including potential for deeper mining and analysis, especially if coupled with AIS data.
• The methodologies from the presentations have promise for use as pre- and postregulation indicators of benefits and effectiveness; Subchapter M implementation is an example.
• Translating the results of these studies from pure data and findings would be useful for messaging to policymakers, rulemaking justification, best practices for waterways management by harbor safety committees, industry operator integration, and union training organizations.

Breakout Session 3A
Data Analytics: Maritime and Freight 3

Alison Conway, City College of New York, presiding
Scott Drumm, Port of Portland, recording

Presentations

Cyber-Physical Applications for Maritime Freight Transportation Systems
Amirhassan Kermanshah, Vanderbilt University

How Information Systems and Data Are Used to Control Operations and Influence Management Decisions at the Panama Canal
Parsa Safa, Lamar University
Transport Analysis Framework of the King Abdullah Petroleum Studies and Research Center: Building a Global Freight Network Model with Satellite and Automatic Identification System Data
Hector Guillermo Lopez-Ruiz, King Abdullah Petroleum Studies and Research Center

This was the third session related to the use and promulgation of data analytics in the maritime and freight industries, focused on applications of analytics to improve decision makers' understanding of maritime and freight mobility needs.

Amirhassan Kermanshah described his work for the Tennessee DOT. His project is designed to help Tennessee DOT officials identify problems with the freight system in the state, both land and water. Three key areas in which officials are looking to use this information are (1) gateway facilitation (managing transportation facilities on the basis of current and upcoming conditions), (2) freight status (the location of freight as it moves through the system), and (3) network status (understanding the performance of the system in terms of congestion, weather issues, and so forth). Kermanshah's work combines traditional data sets, such as the FAF and survey data, with what he describes as cyber-physical (CP) data, such as data coming from asset tracking and onboard monitoring and control systems. Through his work he can estimate the costs and benefits of using technology to improve freight flow. While CP data offer the potential to reduce delay on the road network as well as at navigation locks, they also open firms to the risk of hacking if the data stream is not secured.

Parsa Safa described work to improve the development and use of analytics for operating the Panama Canal. Though recently expanded and modernized, the canal still relies on its usual processes and technology for daily operations. Researchers at Lamar University are working with the Panama Canal Authority to put technology into place to improve operations and management practices.
Among the technologies that Safa described were automated technologies that collect and relay information and conduct analyses in real time. He underscored the importance of these kinds of analytics not just for the Panama Canal but for ports as well. The deployment of technology will improve efficiency, safety, and the overall reliability of canal operations.

Hector Guillermo Lopez-Ruiz presented his work developing the Transport Analysis Framework of the King Abdullah Petroleum Studies and Research Center (KAPSARC). He developed the framework to provide the government of Saudi Arabia with information to determine how the rail industry can position itself to achieve the country's objective of becoming a global logistics hub. Lopez-Ruiz described how he is applying satellite data to freight flow analysis. By analyzing nighttime lighting, he has been able to identify locations of economic activity that serve as high-level freight origin–destination locations. Using AIS data, he has been able to identify the most-used shipping routes around the world. The AIS data are now available on KAPSARC's own open-source portal for researchers and practitioners to use. Future work in developing these kinds of analytics includes bringing them to applicability at an urban level. This work will focus on combining satellite imagery
with computer-aided image identification, enabling the location of commercial activity areas and urban-level freight origins and destinations.

KEY TAKEAWAYS

Three key themes emerged from this session:

• Various emerging technologies can be harnessed to describe and understand freight and maritime activity and to improve operations and decision-making.
• As transportation companies invest in and make use of technology, researchers are provided with new sources of data. However, the use of such data exposes the data and dependent operations to cybersecurity risks.
• One of the challenges will be integrating more traditional data sources and analytical techniques with the emerging data.

Breakout Session 3B
Big Data and Machine Learning: Maritime Applications

Michael Pack, University of Maryland, presiding
Sarah Harrison, University of Georgia, recording

Presentations

Enhancing Database User Experience with Natural Language Processing
Dan Seedah, Texas A&M Transportation Institute

Man Versus Machine: Comparing Traditional Data Collection and Statistical Models with Machine Learning Big Data Analytics
Jolene Hayes, Fehr & Peers

The Practical and Unified Use of Blockchain, Cyber-Security, the Internet of Things, Machine Learning, Artificial Intelligence, and Big Data for the Marine Transportation Industry: Details on Building a Multimodal Freight Analytics Platform
Dean Shoultz, MarineCFO

Validating Automatic Identification System Data Using Machine Learning Algorithms
Edward Carr, Energy and Environmental Research Associates

This session explored how big data and machine learning can be applied to marine transportation.

Dan Seedah showed the development of a natural language user interface to access and answer questions from freight databases. Ideally, this interface automatically retrieves,
preprocesses, and understands heterogeneous data sources to answer a query. This type of artificial intelligence (AI) is in use across other domains but has yet to be implemented successfully in freight analytics. To be a useful tool for researchers and transportation managers, a natural language user interface must correctly identify key words and understand sentence structure, context, and transportation-specific jargon. Seedah used both an untrained (generic) machine learning model, which had been trained only on the Message Understanding Conference, Version 7 (MUC-7) data set covering general fields, including person, location, organization, time, date, percentage, and money, and a trained machine learning model that incorporated domain-dependent fields, including location, commodity, modes of transport, links of transport, date and time, and units of measure. The untrained conditional random fields (CRF) model was poor at understanding anything except locations and times, whereas the hybrid CRF and rule-based model performed best at understanding all components of a freight query. Seedah then offered a simpler approach: an autosuggest feature to guide the user toward inputting valid queries into the database. Before this kind of user interface can be implemented and used by the transportation industry, the larger issue of data congruity must be addressed. Currently, transportation data are collected by multiple agencies, stored in different formats, collected differently, and not subjected to the same level or kind of quality control.

Jolene Hayes discussed how traditional data collection methods, such as surveys, compare with machine learning on big data sets, such as GPS data from electronic logging devices. She presented results from two case studies associated with trucks in southern California.
In the first case study, Hayes performed a truck routing analysis on SR-58 in Kern County, California, to understand how the new alignment of SR-58 might affect truck traffic in the Bakersfield area, such as on north–south routes, including SR-99 and I-5. Her team used both O-D data obtained through onboard logging devices (GPS) and the 2009 intercept truck O-D survey conducted by the KOA Corporation. A large discrepancy between the survey and the GPS results emerged. In the second case study, Hayes again compared GPS data obtained from StreetLight Data with traditional survey methods for truck O-D to and from the Port of Long Beach and the Port of Los Angeles. The GPS methods missed nearly 90 percent of truck traffic to and from the two ports, in agreement with the previous case study's findings. While there are known challenges with traditional survey methods, including being noncontinuous, expensive, and infrequently conducted, there are similar problems with GPS data. For instance, the data do not capture the entirety of the trucking fleet and may not capture a complete trip because of signal loss or the way that the trip is counted. Hayes concluded by stating that bigger data sets are not always better if they include substantial or unexplained gaps. Additionally, operations need to be considered when data collection is being designed so as to capture the full nature of the transportation system.

Dean Shoultz presented a high-level overview of the challenges and opportunities for incorporating emerging technologies, such as AI, Blockchain, the Internet of Things (IoT), and machine learning, into a multimodal freight analytics platform. He proposed an architecture for a freight network that uses advanced, cutting-edge technology, shown in Figure 4, built on Azure, Microsoft's public cloud, which is already equipped with Blockchain and machine learning
capabilities. To be a useful tool, the platform must be highly scalable, resilient, elastic, and capable of ingesting, organizing, and analyzing large volumes of freight-related data from multiple data streams. Shoultz highlighted many existing features within Azure that can be used and incorporated into freight analytics. Such a platform is only as good as the data it includes and, for private businesses, presents intellectual property (IP) and cybersecurity challenges. Cybersecurity remains a critical hurdle that must be addressed before further data can be freely shared between businesses.

Edward Carr and his team applied machine learning techniques to AIS data to improve their usefulness. Researchers focused on South Florida, a region with many kinds of vessel traffic. Of the 10.4 million records of AIS data, only 0.97 percent contained vessel type information. Carr used vessel speed calculated from AIS data, using speed over ground and course over ground, to determine ship activity rather than relying on the ship activity field in the AIS data. Additionally, he treated the ship data as a time series, including deltas, averages, ranges, and standard deviations, to better understand ship behavior. He used scikit-learn, a Python-based machine learning library, and compared the performance of three unsupervised clustering algorithms: k-means, density-based spatial clustering of applications with noise (DBSCAN), and Birch. Of the three, the Birch algorithm could successfully identify the behavior of fishing vessels in the Intracoastal Waterway while in transit to key fishing sites and while actively fishing. This presentation demonstrated that machine learning could be used to identify vessel activity efficiently using existing AIS data, but further work is needed to apply this approach to other kinds of vessels and other regions.

FIGURE 4. Proposed architecture for a freight network.
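The feature-and-cluster approach Carr described can be sketched end to end. The study used scikit-learn's k-means, DBSCAN, and Birch implementations; the toy version below instead implements a tiny k-means directly (standard library only) on hypothetical (mean speed, speed variability) features per track segment:

```python
import random
import statistics

# Toy sketch of feature-based AIS activity classification: each track
# segment is reduced to (mean speed over ground, speed std dev) and
# clustered. The study used scikit-learn (k-means, DBSCAN, Birch); this
# is a minimal, self-contained k-means on hypothetical feature values.

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                  + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        centers = [(statistics.mean(x for x, _ in cl),
                    statistics.mean(y for _, y in cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

# (mean SOG in knots, SOG std dev) per track segment -- hypothetical
segments = [(12.1, 0.4), (11.8, 0.5), (12.5, 0.3),   # steady, fast: transit
            (2.3, 1.9), (1.8, 2.2), (2.9, 1.7)]      # slow, erratic: fishing-like

centers, clusters = kmeans(segments, k=2)
for c, cl in zip(centers, clusters):
    label = "transit-like" if c[0] > 6 else "fishing-like"
    print(f"{label}: center ({c[0]:.1f} kn, sd {c[1]:.1f}), {len(cl)} segments")
```

Slow, variable-speed segments separate cleanly from steady transit segments, which is the same behavioral signal that let the Birch model distinguish active fishing from transit to the fishing grounds.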
KEY TAKEAWAYS

• Blockchain and machine learning applications may be leveraged to improve logistics and research in the maritime domain, but this will require access to multiple, accurate, and complete data streams.
• One speaker suggested that a natural language user interface might be one solution for allowing better access to and analysis of transportation data.
• Access to consistent, quality data was a unifying theme across the presentations, as were concerns about cybersecurity, given the amount of proprietary data sought for these data-intense applications.
• While big data and machine learning are very promising, two speakers identified a key pitfall related to data collection: AIS data on ships and GPS data for trucks are not as complete as anticipated, which limits their utility for informing key decisions.

Breakout Session 3C
Decision Support: Environmental

James Corbett, University of Delaware, presiding
Todd Ripley, Federal Maritime Administration, recording

Presentations

Automatic Identification System Data Improvements to the 2014 Gulfwide Emissions Inventory Study
Heather Perez, U.S. Coast Guard

Merging Automatic Identification System and U.S. Coast Guard Data for Modeling Maritime Air Emissions
Diane Rusanowsky, U.S. Coast Guard

Multiattribute Performance Analysis for Intermodal Maritime Cargos
Jim Corbett, University of Delaware

The group, in an open discussion facilitated by the moderator, talked about how data might be used in identifying funding opportunities for congestion mitigation and air quality programs. The group discussed how states such as California are using AIS data for enforcement and environmental assessments. The group was provided with background regarding how and why the Bureau of Ocean Energy Management (BOEM) was tasked with performing the Gulf of Mexico emissions inventory.
Heather Perez presented on AIS data improvements between the 2014 BOEM Gulf of Mexico emissions inventory study and the 2011 emissions inventory. With the use of cleaned AIS vessel data integrated with IHS vessel data, improved air emissions
modeling was achieved. This improved modeling included vessel type classification with more specific vessel type and engine combinations, actual engine loads versus defaults, highly detailed spatial and temporal resolution, exclusion of state waters activities, and no spatial allocation or assumptions, as activity is calculated in place.

Diane Rusanowsky discussed merging AIS and USCG data for modeling maritime air emissions and indicated that this approach (1) allows for both spatial and temporal location of various maritime-source air emissions; (2) provides accurate counts of most vessels as well as clear indicators of vessel paths, speeds, and time spent in various travel modes that drive engine use assumptions; (3) lends itself to scenario analyses, demonstrated for reasonably granular estimates of air emissions to compare baseline versus scenario conditions; and (4) provides emissions profiles for the three geographic areas, which differ and reflect the maritime uses of each area.

Jim Corbett presented on multiattribute performance for intermodal maritime cargos and highlighted opportunities for modal shifts for cargo to reduce air emissions, energy consumption, and cost, including warehousing, through Right-shoring, Right-steaming, Right-routing, Right-timing, Right-bundling, and Right-mode mixing. In addition, he pointed out the shift to more localized warehousing since the 2010 recession.

KEY TAKEAWAYS

Innovative approaches included

• Using AIS data with IHS vessel data to provide increased accuracy in the assessment of vessel emissions in the Gulf of Mexico,
• Merging AIS and USCG data for modeling maritime air emissions, which allows for both spatial and temporal location of maritime sources, and
• Using multiattribute performance analysis to highlight opportunities for modal shifts for intermodal maritime cargo.
Data concerns and issues included

• Cleaning and honing the AIS data to remove ghost or erroneous records that could compromise the ship emission assessment of a region,
• The difficulty and impracticality of identifying and interpreting all sources for the full AIS data set, and
• The identification of modal shift opportunities amid rapidly changing economic conditions and the dynamic evolution of the transportation system.

Action items were identified:

• Continue to refine and evaluate AIS data and vessel information for improved BOEM ship and vessel emission assessments.
• Expand research to other available data sets and tools, such as GIS.
• Explore using data and analytic analysis to identify where modal shifts for cargo and freight are beneficial, cost-effective, and sensible for system users.