Frequently Asked Questions

General Questions​

note

Which requirements must be met by the vessel’s IT infrastructure?​

The Hoppe Ship-to-Shore connection does not provide its own VSAT connection. Therefore, the client must ensure that a VSAT connection is available. For communication with the land-based servers, the following IP addresses and the corresponding port must be enabled in the vessel's firewall for outgoing TCP traffic from the HOMIP2.

                    IP              Port
Primary Address     75.2.111.192    11550
Fallback Address    99.83.166.216   11550

A detailed checklist can be found in the Ship-to-Shore Fact Sheet provided in the Download section of our product page.
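
To verify that the firewall rules are in place, a simple outbound reachability check can be run from a machine inside the vessel's network. The following Python sketch only tests whether a TCP connection can be opened to the addresses listed above; it does not speak the Ship-to-Shore protocol itself.

```python
import socket

# Endpoints from the table above: primary and fallback address, port 11550.
ENDPOINTS = [("75.2.111.192", 11550), ("99.83.166.216", 11550)]

def check_outbound(host: str, port: int, timeout: float = 10.0) -> bool:
    """Return True if an outgoing TCP connection can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for host, port in ENDPOINTS:
        status = "reachable" if check_outbound(host, port) else "blocked"
        print(f"{host}:{port} -> {status}")
```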

note

Is it possible to load external or historic data into the data pool via interface?​

In terms of data storage, Hoppe Marine uses a "data lake" structure: structured data (e.g. log data, telemetry data) can be loaded into the data pool from a variety of sources. For data analysis, time series data in various formats are the most relevant. Currently, the following formats are accepted: CSV, JSON, SQLite and Parquet. Furthermore, the client must define the column name assignments, because the data has to be mapped to a universal naming standard for evaluation.

The following example can be given for illustration:

The automation system of a vessel exports data in the following format:

Time stamp              Main Engine Power       Main Engine Speed
21/06/2018 11:09:33     20111                   76

The data must be transformed into a standardized format for better handling. The target format is therefore as follows:

ti.utc.act.ts@last      me.pow.act.kw@avg       me.rev.act.rpm@avg
1529579373.00           20111                   76

The translation of the source data into the target format is defined by a so-called "Mapping". Details about the structure of the Mapping can be found in our API documentation (https://docs.hoppe-sts.com). Hoppe Marine is happy to assist with the initial data import into the data pool and the initial creation of the Mapping.
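
As an illustration, a transformation of this kind could be applied with a few lines of Python. The dictionary below is only a sketch of the idea; the actual Mapping structure is described in the API documentation.

```python
import pandas as pd

# Illustrative column mapping only; the real "Mapping" structure is defined
# in the Ship-to-Shore API documentation (https://docs.hoppe-sts.com).
COLUMN_MAP = {
    "Main Engine Power": "me.pow.act.kw@avg",
    "Main Engine Speed": "me.rev.act.rpm@avg",
}

def to_target_format(df: pd.DataFrame) -> pd.DataFrame:
    """Rename source columns and convert the timestamp to Unix epoch seconds."""
    out = df.rename(columns=COLUMN_MAP)
    ts = pd.to_datetime(out.pop("Time stamp"), format="%d/%m/%Y %H:%M:%S", utc=True)
    epoch = (ts - pd.Timestamp("1970-01-01", tz="UTC")) / pd.Timedelta(seconds=1)
    out.insert(0, "ti.utc.act.ts@last", epoch)
    return out

source = pd.DataFrame({
    "Time stamp": ["21/06/2018 11:09:33"],
    "Main Engine Power": [20111],
    "Main Engine Speed": [76],
})
print(to_target_format(source))   # ti.utc.act.ts@last = 1529579373.0
```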

Security​

note

Is the Ship-to-Shore transmission secure?​

The Ship-to-Shore transmission uses a multi-level security concept that distinguishes between Identity Protection, Access Protection and Integrity Protection.

Identity Protection​

The first level of the security concept ensures the trustworthiness of the communication partners. Every communication endpoint is fitted ex works with a private cryptographic key. This key never leaves the device and therefore cannot be compromised. Only a device that knows a correct key is allowed to transmit or receive data.

Access Protection​

Once it is ensured that the data comes from a trustworthy source, a secure TLS-encrypted connection between the communication partners is established in the next step. This encrypted connection prevents access by third parties; only the two communication partners are able to read the data in clear text.

Integrity Protection​

After successful transmission, a further step ensures that the data is intact and corresponds to the data sent from the vessel. For this purpose, cryptographic signatures according to the industry standard RFC 7519 are used.
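
RFC 7519 describes JSON Web Tokens (JWT). A minimal sketch of verifying such a signature might look as follows; the token contents, key handling and chosen algorithm are assumptions for illustration, not the production implementation.

```python
import jwt  # PyJWT

def verify_export_token(token: str, public_key_pem: str) -> dict:
    """Verify an RFC 7519 (JWT) signature made with an elliptic curve key
    and return the signed claims; raises jwt.InvalidTokenError on failure."""
    return jwt.decode(token, public_key_pem, algorithms=["ES256"])

# Hypothetical usage: the vessel signs export metadata with its private key,
# the shore side verifies it against the corresponding public key.
# claims = verify_export_token(received_token, vessel_public_key_pem)
```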

note

How and when does the encryption take place?​

Service/Device                            Encryption
Datastore on HOMIP2                       None, but secured by Debian system security and RBAC scheme.
Export data files                         Cryptographically signed with elliptic curve private key.
Data transport to shore                   TLS encryption in transit.
Data store on shore                       AES encryption at rest. No in-memory encryption during data handling.
Data distribution to customer via API     TLS encryption in transit.
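
For illustration, signing an export file with an elliptic curve private key and verifying the signature could look roughly like the sketch below (using the Python cryptography package; curve, hash and key handling are assumptions, and the actual scheme is defined by Hoppe Marine).

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Curve and hash are assumptions for illustration; a fresh key is generated
# here, whereas the real private key is installed on the device ex works.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

data = b"contents of an exported data file"           # placeholder payload
signature = private_key.sign(data, ec.ECDSA(hashes.SHA256()))

try:
    public_key.verify(signature, data, ec.ECDSA(hashes.SHA256()))
    print("signature valid: file matches what was signed on board")
except InvalidSignature:
    print("signature invalid: file was altered or signed with another key")
```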

Data Handling​

Data Transmission​

note

How does the data transmission work?​

The data transmission is file-based, and the interval for data exports can be configured. For the transmission, a satellite connection is used and a direct, encrypted connection to fixed IP addresses (without DNS) is established. All files are collected as database files and transmitted together in blocks. The entire file content is subjected to an integrity test using a hash method. The transmission of the encrypted data is subject to bandwidth regulation.
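
The integrity test can be pictured as comparing a hash of the received file against the hash computed on board. The sketch below uses SHA-256 purely as an illustration; the exact hash algorithm and exchange mechanism are not specified here.

```python
import hashlib

def file_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical shore-side check:
# if file_digest("received_block.db") != digest_reported_by_vessel:
#     ...the affected file is scheduled for retransmission...
```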

note

What kind of data transmission volume between the ship and the shore can we expect?​

Studies on data volume revealed the following rough relationship between the number of logged signals and the transmitted data volume:

Number of logged signals   Logging rate   Raw export volume per 24 h   Compressed export volume per 24 h   Estimated data volume per month
300                        60 s           3.5 MB                       0.7 MB                              21 MB
300                        10 s           21 MB                        4.1 MB                              123 MB
1000                       10 s           69 MB                        14 MB                               420 MB
1000                       1 s            691 MB                       138 MB                              4140 MB

Example calculation:

  • 240 signals for tank content measurement of 60 tanks (height, volume, mass, density), 240 data points logged once per minute
  • 40 nautical signals e.g. GPS, wind, speed, heading... = 40 data points logged once per minute
  • 50 main engine signals e.g. RPM, torque, power, 12 x TC RPM, 12 x exhaust temp, 12 x something else. About 50 data points logged every 10s
  • 20 high resolution data points of the vessel motion state. About 20 signals logged once per second.

Total data file size:

  • Volume uncompressed: 20MB/day
  • Volume compressed: 4MB/day
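
The totals above can be reproduced with a rough back-of-envelope calculation. The figure of about 8 bytes per raw logged value and the roughly 5:1 compression ratio are inferred from the table above and are approximations only.

```python
# Rough estimate, assuming ~8 bytes per raw logged value and a ~5:1
# compression ratio (both inferred from the table above, not exact figures).
BYTES_PER_SAMPLE = 8
COMPRESSION_RATIO = 5

signal_groups = [
    (240 + 40, 60),   # tank content + nautical signals, logged every 60 s
    (50, 10),         # main engine signals, logged every 10 s
    (20, 1),          # vessel motion state, logged every second
]

samples_per_day = sum(n * 86_400 // rate for n, rate in signal_groups)
raw_mb = samples_per_day * BYTES_PER_SAMPLE / 1e6
print(f"{samples_per_day} samples/day -> {raw_mb:.1f} MB raw, "
      f"{raw_mb / COMPRESSION_RATIO:.1f} MB compressed per day")
# ~2.56 million samples/day -> about 20 MB raw and 4 MB compressed,
# matching the totals above.
```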

Depending on the export interval, an additional transmission overhead needs to be considered (see "How big is the transmission overhead?").

For the above example, with a data volume of about 1 MB transmitted every 6 hours, the transmission overhead would be less than 10 %.

note

What happens when the integrity test fails? Will the entire file content for which the integrity test failed be transmitted again?​

Yes, but this only happens for files whose transmission cannot be continued. In the "normal" case of a connection failure, the transmission continues where it was interrupted. Only in case of a high number of packet losses is the transmission repeated completely. A bandwidth detection prevents any transmission if the risk of a weak transmission is detected, so a complete repetition of a transmission is only the last resort. The maximum number of transmission attempts can be defined per file; thus, expensive and endless attempts are avoided. When a better connection quality is detected, the transmission can be restarted via a web interface directly on board.
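
The retry behaviour can be pictured roughly as follows. This is only an illustrative sketch of resuming at a byte offset with a per-file attempt limit, not the actual Ship-to-Shore implementation; `send_from` is a hypothetical stand-in for the transport layer.

```python
import os

def transmit_file(path: str, send_from, max_attempts: int = 5) -> bool:
    """Illustrative resume-and-retry loop. `send_from(path, offset)` stands in
    for the transport layer and returns the byte offset confirmed by the
    receiver (the full file size once the transfer has completed)."""
    size = os.path.getsize(path)
    offset = 0
    for _ in range(max_attempts):
        offset = send_from(path, offset)   # resume where the last attempt stopped
        if offset >= size:
            return True                    # file fully transmitted
    return False                           # attempt limit reached; stop trying
```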

note

What is the origin of the timestamp for time series data?​

The time series data is logged by Hoppe Marine's iDB server, which uses the master time on the HOMIP2 as the time source. This time source is configured on the HOMIP2 display and is independent of the date/time of the operating system. If the customer has a GPS receiver on board, the HOMIP2 can also use the GPS time signal as the time source. GPS time signals are the preferred time source, since they provide a well-established time standard and minimize the chance of entering the date/time incorrectly.

note

How big is the transmission overhead in addition to the transmitted amount of data?​

Encryption, connection set-up, etc. have already been optimized. However, the following should always be considered: the smaller the individual exported data blocks are (i.e. the shorter the export intervals), the bigger the relative overhead becomes. This is similar to a letter that always costs 80 ct in postage, no matter whether it contains one A4 page or three. The following figure shows measurement results for the relationship between payload size and transmission overhead; it clearly indicates that the relative overhead becomes smaller when larger files are encrypted and transmitted.
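
The effect can be illustrated with a simple model in which every transmission carries a fixed overhead. The numbers below are made up for illustration and are not measured values.

```python
# Toy model: relative overhead = fixed_overhead / (fixed_overhead + payload).
# The fixed 50 kB per transmission is an assumed figure for illustration only.
FIXED_OVERHEAD_KB = 50

for payload_kb in (100, 500, 1000, 4000):
    rel = FIXED_OVERHEAD_KB / (FIXED_OVERHEAD_KB + payload_kb)
    print(f"payload {payload_kb:5d} kB -> relative overhead {rel:.1%}")
# Larger export blocks (longer export intervals) yield a smaller relative overhead.
```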

note

Why is time correctness essential for meaningful, high-quality data?​

Our goal is to log and provide ship operation data accurately and consistently. This makes it possible to identify, in retrospect, the exact date and time at which an event occurred or what the value of a signal was at that moment. This cannot be done if the time source is unreliable. As a worst-case example, if the clock is set back, duplicate data can occur for the same time range; if this happens, a meaningful report cannot be generated.
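
A simple sanity check for this worst case is to verify that the logged timestamps are strictly increasing. The sketch below assumes a time series in the standardized format shown earlier; the column name and usage are illustrative.

```python
import pandas as pd

def find_time_anomalies(df: pd.DataFrame,
                        ts_col: str = "ti.utc.act.ts@last") -> pd.DataFrame:
    """Return rows whose timestamp is not strictly greater than the previous
    one, e.g. duplicates caused by a clock that was set back."""
    return df[df[ts_col].diff() <= 0]

# Hypothetical usage on an exported time series:
# anomalies = find_time_anomalies(exported_df)
# if not anomalies.empty:
#     ...investigate the time source before generating reports...
```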