Port of Long Beach. Photo 76106514 © trekandshoot | Dreamstime.com

The newly launched data platform project of the port of Long Beach is gaining traction on the US west coast, after the ports of Oakland and the Seattle-Tacoma Northwest Seaport Alliance (NWSA) signalled support for the concept.

Shippers have now called for a nationwide platform for container shipping data.

Less than three months after it first announced plans for a platform for container shipping data that shippers can freely access, the port of Long Beach officially started its ‘Supply Chain Information Highway’ venture during the TPM22 conference last week.

The first phase focuses on container visibility events, but will later extend to other data elements, according to Noel Hacegaba, the port’s deputy executive director.

The launch followed a pilot programme involving one terminal operator at the port, together with several transport providers and shippers, in which data was transmitted, queried and validated.

Mr Hacegaba acknowledged that shippers and importers require such data from more than one port, as most of them use multiple gateways to import or export their traffic.

“Creating a shared digital platform will provide decision-makers with timely, comprehensive and quality data,” said Bryan Brandes, maritime director of the port of Oakland.

“Having this cargo visibility tool can help speed the supply chain and set a digital foundation for improving goods movement.”

Mr Hacegaba remarked that the Supply Chain Information Highway would not compete with existing digital platforms for ocean cargo. The system is meant to complement these, he said.

The port of Los Angeles already has a container data platform in operation that offers port users a mix of free and commercial tools, such as average dwell times and truck turn times.

Shippers, meanwhile, are ultimately interested in a national platform for maritime transport data, and the National Retail Federation has been one of the most vocal champions of this push.

That desire is shared by the likes of FedEx and Amazon, which have stressed the need for such a tool with a uniform set of standards.

The lack of uniform, consistent data results in inefficiencies in supply chains and in higher costs, they have argued.

Their comments came at a recent session organised by the Federal Maritime Commission (FMC), which is holding a series of meetings, interviews and round table sessions in an effort to identify “data constraints that impede the flow of ocean cargo and add to supply chain inefficiencies”.

The ultimate objective of the undertaking, led by FMC commissioner Carl Bentzel, is to come up with recommendations for common data standards used in the international shipping supply chain and to suggest access policies and protocols to streamline information sharing across the container supply chain.

“Events of the past year have proven the need for the United States to achieve more capacity from our cargo delivery system. Information sharing and additional transparency in how containers move is one way we can move more containers more efficiently,” stated FMC chairman Daniel Maffei.

The initial findings of the current round of information gathering are due to be presented at a Maritime Data Summit this spring, the FMC signalled.

Some observers have suggested a national data platform for ocean cargo could be fed by information captured by customs through the automated customs entry system. In many cases this could provide advance information before a US-bound vessel reaches its destination port.
