In this paper, a solution for estimating the relative position and orientation between two ships in six degrees-of-freedom (6DOF) using sensor fusion and an extended Kalman filter (EKF) approach is presented. Two different sensor types, based on time-of-flight and inertial measurement principles, were combined to create a reliable and redundant estimate of the relative motion between the ships. An accurate and reliable relative motion estimate is expected to be a key enabler for future ship-to-ship operations, such as autonomous load transfer and handling. The proposed sensor fusion algorithm was tested with real sensors (two motion reference units (MRUs) and a laser tracker) and an experimental setup consisting of two Stewart platforms in the Norwegian Motion Laboratory, which represents an approximate scale of 1:10 when compared to real-life ship-to-ship operations.

## Introduction

Ship-to-ship, also called vessel-to-vessel, operations are used to transfer payloads or personnel from one ship to another. Such operations can be severely limited by weather conditions, especially for smaller ships, which can exhibit significant weather-induced motions. Compensation systems such as dynamic positioning ((DP), see for example Refs. [1] and [2]) and active heave compensation for cranes and winches (see for example Ref. [3]) can be used to extend the so-called weather window.

In order to achieve safe and accurate load transfer by operator-guided automatic control systems, or fully autonomous systems, the relative motion between the two ships must be measured in six degrees-of-freedom (6DOF) and in real-time. It is critical that the measurement of the relative motion between the two ships is reliable and uninterrupted during the operation. If this measurement is lost, severe damage to the payload, personnel, and/or the ships may result.

The proposed solution to the ship-to-ship measurement problem relies on the use of sensor fusion techniques to combine both inertial and visual sensors in real-time. Figure 1 is used to illustrate the problem, where two ships are lying alongside each other, and a crane is supposed to land a load onto the secondary ship's cargo deck. The figure indicates that there is a visual sensor capable of measuring the absolute pose between two coordinate frames attached to each of the ships.

In addition, a motion reference unit (MRU) sensor is placed on each of the ships, capable of measuring the body movement of each ship relative to its respective heading frame. The sensor fusion goal is to estimate the body-to-body pose, meaning that the heading offset has to be estimated using a sensor fusion approach since it is not directly measured by the sensors. In the case where the two ships are moored to each other or controlled by some DP system, the heading offset is more or less constant or at least slowly varying. By using this information, it should be possible to lose sight of the visual marker attached to the secondary ship for short time periods and still be operational by relying on the last estimated heading offset and the inertial measurements from both MRU sensors. It is also assumed that there exists a wireless communication link between the ships for real-time sensory data transfer. This has been experimentally investigated and validated earlier in Ref. [4]. In Sec. 2, the ship-to-ship kinematics will be elaborated in more detail, the process and measurement models will be presented, and a state estimation algorithm using the extended Kalman filter (EKF) will be presented at the end.

In order to increase the reliability of the relative motion measurement, a sensor fusion approach is proposed in this paper by utilizing two different measurement principles: (1) a visual (time-of-flight) sensor directly measuring the relative motion and (2) an MRU located on each ship. These two sensor types were selected because the visual sensor allows the system to be operated from one ship only, even if the communication with the second ship is lost. An MRU sensor on each ship was selected for additional safety, since the visual sensor may be interrupted by water spray or other obstacles, for example, the payload itself. One requirement of the sensor fusion algorithm is the ability to operate with only one sensor system and to automatically include the second sensor system when it becomes available after downtime. The use of two MRU sensors alone introduces drift in the estimated relative motion over time. The sensor fusion algorithm should be able to eliminate this drift when the data from the visual sensor are included.

A higher level of autonomy is expected in future offshore load handling and shipping activities in general. In Ref. [5], it is stated: *Autonomous shipping is the future of the maritime industry. As disruptive as the smartphone, the smart ship will revolutionize the landscape of ship design and operations.* One specific case considered in this paper is the load transfer from one ship onto another using an offshore crane where the load is suspended from the crane tip. Today, these operations are carried out by experienced and highly skilled crane operators. It is therefore of high interest to the industry to equip these cranes with autonomous systems intended to assist the crane operator in such offshore operations. The result may be a higher level of repeatability and safety, and more efficient load transfers. As a first step toward solving this problem, the relative ship-to-ship motion has to be estimated in real-time.

## Relative Vessel Motion Detection

### Ship-to-Ship Kinematic Definitions.

In this section, the relative ship-to-ship kinematics will be investigated in order to form a fundamental understanding of the relative motions occurring between the two ships while situated at sea. This kinematic model will form the basis for both the process and the measurement models used for the relative state estimation problem presented later on. The notation and the rigid body ship kinematics follow Fossen's robotics-like approach to hydrodynamics; see Ref. [6] for more detailed information on these topics. The kinematic structure between the two ship bodies is illustrated in Fig. 2.

The motion of each ship is described by the rigid body transformation between the heading frame {*n*} and the body fixed frame {*b*}. This rigid body transformation can be parameterized using the following transformations:

$\eta = [(r^n_{b/n})^T,\ \Theta_{nb}^T]^T$

where $r^n_{b/n}$ is the position and $\Theta_{nb} = [\phi,\theta,\psi]^T$ are the Euler angles describing the orientation of {*b*} relative to the inertial frame {*n*}. Velocities are usually described using body fixed velocities, at least for the rotational velocities $\omega^b_{b/n}$; this is motivated by the fact that the time derivatives of the Euler angles $\dot{\Theta}_{nb}$ have no direct physical interpretation. The transformation between these two representations is given by

$\dot{\Theta}_{nb} = T_\Theta(\Theta_{nb})\,\omega^b_{b/n}$

The translational body velocity $v^b_{b/n}$ is similarly rotated into {*n*}, which will result in the following velocity Jacobian matrix:

$\dot{\eta} = J_\Theta(\eta)\,\nu, \quad J_\Theta(\eta) = \mathrm{diag}\left(R^n_b(\Theta_{nb}),\ T_\Theta(\Theta_{nb})\right)$
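As an illustration, the Euler-angle rate transformation $T_\Theta$ for the zyx convention used in Fossen's formulation can be sketched in plain Python. This is an illustrative sketch, not the paper's implementation:

```python
import math

# T(Theta) maps body-fixed angular velocity to Euler angle rates
# (zyx convention). Singular at theta = +/-90 deg, where tan/sec blow up.
def T_theta(phi, theta):
    sp, cp = math.sin(phi), math.cos(phi)
    st, ct = math.sin(theta), math.cos(theta)
    tt = st / ct  # tan(theta)
    return [[1.0, sp * tt, cp * tt],
            [0.0, cp,      -sp],
            [0.0, sp / ct, cp / ct]]
```

At zero roll and pitch the transformation reduces to the identity, i.e., Euler rates and body rates coincide.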

By measuring all 6DOF between {*c*} and {*p*}, it is possible to also determine the relative motions between {*b*_{1}} and {*b*_{2}} from direct calculations according to

$R^{b_1}_{b_2} = R^{b_1}_{c} R^{c}_{p} R^{p}_{b_2}$

$r^{b_1}_{b_2/b_1} = r^{b_1}_{c/b_1} + R^{b_1}_{c}\, r^{c}_{p/c} + R^{b_1}_{c} R^{c}_{p}\, r^{p}_{b_2/p}$

where $r^{b_1}_{b_2/b_1}$ represents the position of {*b*_{2}} relative to {*b*_{1}} given in {*b*_{1}}, and $R^{b_1}_{b_2}$ describes the orientation of {*b*_{2}} relative to {*b*_{1}}. The type of sensor used to track all 6DOF between {*c*} and {*p*} can vary and may be based on laser technology, like the one used in Ref. [7], or on a camera vision system together with some kind of fiducial marker, like the ArUco marker presented in Ref. [8] and applied in Ref. [9]. However, this solution cannot be seen as robust on its own, especially not in harsh offshore conditions where sea spray, varying light conditions, and physical objects may interrupt the line of sight of the visual tracker for unknown time periods. In addition, this solution does not account for how the bodies are oriented with respect to gravity, which may be disadvantageous when it comes to controlling a hanging load from one ship onto another.
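The direct calculation amounts to chaining rotation/translation pairs along the frames {*b*_{1}} → {*c*} → {*p*} → {*b*_{2}}. A minimal pure-Python sketch follows; the mounting offsets used below are hypothetical placeholder values, not calibration data from the lab:

```python
import math

# Minimal homogeneous-transform helpers (illustrative only).
def rot_z(psi):
    c, s = math.cos(psi), math.sin(psi)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def compose(R_ab, r_ab, R_bc, r_bc):
    """Chain pose of {b} in {a} with pose of {c} in {b} -> pose of {c} in {a}."""
    return mat_mul(R_ab, R_bc), [x + y for x, y in zip(r_ab, mat_vec(R_ab, r_bc))]

# Hypothetical mounting/measurement values (placeholders, not lab data):
R_c_b1, r_c_b1 = rot_z(0.0), [1.0, 0.0, 2.0]     # tracker mount on ship 1
R_p_c,  r_p_c  = rot_z(0.1), [8.0, 0.5, -0.2]    # pose measured by the tracker
R_b2_p, r_b2_p = rot_z(0.0), [-0.5, 0.0, -1.0]   # probe mount on ship 2

R, r = compose(R_c_b1, r_c_b1, R_p_c, r_p_c)
R_b2_b1, r_b2_b1 = compose(R, r, R_b2_p, r_b2_p)  # relative body-to-body pose
```

With zero-yaw mounting transforms, the relative yaw recovered from $R^{b_1}_{b_2}$ equals the 0.1 rad measured by the tracker, as expected.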

By extending the measurement loop to also account for the measurements carried out with an MRU sensor placed on each of the two vessels, a more robust solution to the relative tracking problem is feasible. This may even allow for periods where it is not possible to track the other ship visually using a visual tracker, which introduces redundancy compared to the previously mentioned solution. To solve this problem, the two MRUs and the visual tracker have to be combined using a sensor fusion approach. Both the system process and the measurements have to be modeled in order to apply sensor fusion methods such as the EKF or the Particle filter [10,11].

### Process Model.

The states of each ship process model consist of the pose $\eta$ describing the position and orientation of {*b*} relative to {*n*}, the velocities $\nu$, and the accelerations $\dot{\nu}$, where the details of the state vectors are given in Eqs. (4) and (5). By using Eq. (4), it is straightforward to form the ship process model.

Each ship process model is defined relative to its own heading frame, {*n*_{1}} and {*n*_{2}}, respectively. It is also assumed that both ships are controlled by a DP system, meaning that the offset position $r^{n_1}_{n_2/n_1}=[\Delta x,\Delta y,\Delta z]^T$ and the orientation $R^{n_1}_{n_2}=R_z(\Delta \psi)$ are slowly varying or close to constant during operation. The heading offset state $x_o=[\Delta x,\Delta y,\Delta z,\Delta \psi]^T$ is therefore augmented to the state vector,

where all the kinematic variables are illustrated by Fig. 2.
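A minimal sketch of one prediction step over the augmented state [$\eta_1$, $\nu_1$, $\eta_2$, $\nu_2$, $x_o$] is given below: each ship pose integrates its velocity (simplified here to a direct Euler step, ignoring the velocity Jacobian), while the velocities and the slowly varying heading offset $x_o$ are modeled as random walks and held constant in the prediction. The structure and dimensions are illustrative assumptions, not the paper's exact Eqs. (4) and (5):

```python
# state = [eta1, nu1, eta2, nu2, xo]; Ts is the sampling time in seconds.
def predict(state, Ts):
    eta1, nu1, eta2, nu2, xo = state
    eta1 = [p + Ts * v for p, v in zip(eta1, nu1)]  # pose integrates velocity
    eta2 = [p + Ts * v for p, v in zip(eta2, nu2)]
    # nu1, nu2 and xo are random walks: held constant, uncertainty grows via Q
    return [eta1, nu1, eta2, nu2, xo]
```

In the full EKF, this prediction is accompanied by propagating the covariance with the Jacobian of the transition function and adding the process covariance $Q$.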

### Measurement Model.

In principle, any sensor capable of measuring all 6DOF between {*c*} and {*p*} could be used, but the Leica laser tracker is used since it offers state-of-the-art accuracy as previously investigated in Ref. [7]. The two MRU measurement models are defined as the measured pose of each body relative to its heading frame, and the Leica measurement model is formed from the components of the measured transformation between {*c*} and {*p*},

where the subscript (*n*, *m*) indicates row *n* and column *m* in the rotation matrix $R^c_p$. The measurement model is highly nonlinear and requires the Leica tracker to measure both the position and orientation of {*p*}. It is also worth mentioning that all the rotation matrix components are included in the measurement model, even though one might consider using a quaternion approach instead to obtain fewer measurement equations.

### Process and Measurement Covariances.

Each standard deviation *σ* is tuned according to the datasheets of the MRUs and the Leica tracker, and where the datasheets did not provide sufficient information, physical experiments have been conducted to find suitable parameters for the standard deviation of each sensor output signal. The standard deviation has then been found by comparing the sensor output with the Stewart platform feedback signals, and hence the standard deviation of a specific signal is found as

$\sigma_x = \sqrt{\dfrac{1}{N-1}\sum_{i=1}^{N}\left(e_i-\mu_e\right)^2}$

where *N* is the number of data samples from the experiment, $\mu_e$ is the mean value of the error trend *e*, and $e_i$ is the *i*th error sample, formed by taking the difference between the sensor measurement $y_m$ and the high precision Stewart platform feedback signal $y_s$, which is used to benchmark the sensor precision.
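This estimate is simply the sample standard deviation of the sensor-minus-feedback error series. A minimal sketch, where `y_m` and `y_s` are the sensor and benchmark signals:

```python
import math

def sensor_sigma(y_m, y_s):
    """Sample standard deviation of the error e_i = y_m[i] - y_s[i]."""
    e = [m - s for m, s in zip(y_m, y_s)]
    n = len(e)
    mu = sum(e) / n  # mean error trend mu_e
    return math.sqrt(sum((ei - mu) ** 2 for ei in e) / (n - 1))
```

The resulting per-signal standard deviations populate the diagonal of the corresponding measurement covariance matrix.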

It should be noted that tuning the process covariance $Q$ is, in general, hard, at least when compared to the measurement covariance $R$. However, it may be tuned to give a "satisfactory" response of the EKF algorithm by a trial-and-error procedure. This tuning principle is motivated by the fact that a lower process covariance relative to the measurement covariance will make the EKF follow, or trust, the model more than the sensor measurements; conversely, if the measurement covariance is lower than the process covariance, the sensor measurements tend to dominate the EKF output. In our experimental work, the measurement covariance is more or less known from datasheets and previous experiments, and it has therefore been decided to first find the measurement covariance matrices $R_{mru}$ and $R_{leica}$ and then tune the process covariance $Q$ to yield a satisfactory EKF output by trial and error.

### Sensor Fusion Using Extended Kalman Filter.

Using the sampling time $T_s$, the discrete state transition function is given by the discretized process model, where $\hat{x}_{k-1|k-1}$ indicates the previous state estimate and $\hat{x}_{k|k-1}$ is the predicted state estimate. The actual implementation of the EKF is described using Algorithm 1.

From the algorithmic implementation, it is seen that $Hk,\u2009Rk$, and $yk$ change in size from each correction given that the Leica tracker is available or not. This feature enables relative ship motion tracking even when the Leica tracker loses visual sight of the tracking probe. The resulting tracking performance with and without the Leica tracker is presented in the experimental section.
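The size-changing correction step can be sketched as follows: the measurement vector $y_k$, the Jacobian $H_k$, and the covariance $R_k$ are stacked only from the sensors that delivered data in the current cycle, so their sizes shrink when the Leica tracker drops out. The names and tuple layout below are illustrative assumptions, not the paper's Algorithm 1 verbatim:

```python
# Each sensor contributes (available, y_rows, H_rows, R_diag_rows).
def stack_available(sensors):
    """Assemble y_k, H_k, R_k from the sensors available this cycle."""
    y, H, R = [], [], []
    for available, y_i, H_i, R_i in sensors:
        if available:
            y.extend(y_i)   # measurement rows
            H.extend(H_i)   # matching Jacobian rows
            R.extend(R_i)   # matching noise-covariance diagonal entries
    return y, H, R
```

The standard EKF gain and update equations then operate on the stacked quantities without any other changes, which is what allows the filter to run on MRU data alone during tracker outages.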

## Ship Motion Simulation

The simulated wave-induced ship motions are generated from a linearized wave response spectrum, where *S*(*ω*) is the Pierson–Moskowitz wave spectrum. A comparison between the wave energy spectrum and the linearized spectrum is given in Fig. 3. Both ships are assumed to be controlled by a dynamic positioning system, meaning that the control objective is defined by a PD control law,

where $K_p$ and $K_d$ are the controller gains, which have to be tuned to give a satisfactory response. The control gains are calculated by comparing the coefficients of the closed-loop characteristic polynomial $p(s)=f(\tau,K_{SS},K_p,K_d)$ with those of the desired closed-loop characteristic polynomial $p_d(s)=(s+p_d)^2$.
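As a sketch of the coefficient-matching step, assume a first-order ship model $\tau \dot{v} + v = K_{SS} u$ with position integrating velocity (an assumption made here for illustration; the paper's exact $f(\tau,K_{SS},K_p,K_d)$ is not reproduced). The closed loop under PD control is then $s^2 + \frac{1+K_{SS}K_d}{\tau}s + \frac{K_{SS}K_p}{\tau}$, and matching against $(s+p_d)^2$ gives:

```python
# Pole placement by coefficient matching for the assumed first-order model:
#   s^2 + ((1 + K_SS*Kd)/tau) s + K_SS*Kp/tau  ==  s^2 + 2*p_d s + p_d^2
def dp_gains(tau, K_SS, p_d):
    Kp = tau * p_d ** 2 / K_SS          # from the constant term
    Kd = (2.0 * tau * p_d - 1.0) / K_SS  # from the s term
    return Kp, Kd
```

Both closed-loop poles then sit at $-p_d$, giving a critically damped response whose speed is set by the single design parameter $p_d$.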

## Experimental Lab Setup

To validate the proposed sensor fusion algorithm, the Norwegian Motion Laboratory is used for the experimental testing; see^{1} and Ref. [7] for more detailed information about the experimental test setup. In Fig. 4, the lab facility is depicted to illustrate the physical size of the laboratory. The Motion Laboratory consists of two 6DOF Stewart platforms, which are used to simulate two ships moving asynchronously alongside each other. The motion trajectory used to prescribe the motions of the two platforms is generated in real-time using the simulation model presented in Sec. 3. The simulation model was realized in Simulink and implemented on a real-time target, specifically the Beckhoff CX2040 industrial PC. The Simulink model was compiled to real-time compatible code using Beckhoff's real-time module export tool for Simulink/MATLAB. Both Stewart platforms and the sensory equipment are connected to the CX2040, and hence the experiment was carried out using this device. A brief overview of the lab setup is given in Fig. 5.

The industrial robot placed on top of the larger Stewart platform remained inactive during all the experiments. However, the size of the robot is used to define the scale of the whole lab setup. Comparing the maximum operational reach of the robot, ≈2.5 m, with MacGregor's three-dimensional compensated crane,^{2} which is stated to reach ≈25 m, the lab setup is believed to represent a scale of 1:10 relative to a real-life ship-to-ship operation.

The figure also illustrates the real-time 6DOF laser tracker {*c*} and the tracking probe {*p*} placed on the small platform. The two MRU sensors are placed on each of the platforms and are calibrated to measure the body movements of the two platforms {*b*_{1}} and {*b*_{2}} relative to their neutral positions (heading frames) {*n*_{1}} and {*n*_{2}}. The communication/control interfaces of the two platforms, the two MRUs, and the control unit (CX2040) are realized over Ethernet using the User Datagram Protocol (UDP). The Leica laser tracker is interfaced through a deterministic EtherCAT connection. The overall communication cycle runs at a fixed 5 ms update interval, and each cycle is logged to a JSON file for offline postprocessing.

## Experimental Results

The proposed sensor fusion algorithm has been realized numerically in MATLAB, and the logged data from the experimental lab setup have been used as inputs to the algorithm. The process and measurement covariances were tuned to give a satisfactory response. It should also be mentioned that the heave motion generated from the simulation described in Sec. 3 had to be scaled down in order to not violate the maximum stroke length of the Stewart platforms. As discussed in the problem formulation, the visual sensor may be interrupted by environmental disturbances such as sea spray, fog, or other obstacles; hence, Algorithm 1 is designed to also account for such events. The total test period was 150 s: during the first 50 s, all three sensor measurements were used in the correction phase; in the time period $t\in[50,100]$ s, the Leica laser tracker was deactivated to simulate that the tracker lost visual sight; and in the last 50 s, the Leica tracker was activated once again to simulate that visual sight was re-established.

The estimated relative motion output of the EKF is expressed relative to {*n*_{1}}. The mathematical representation of the EKF output error is given as

$e = \hat{y} - y$

where $e$ represents the error of the estimated EKF output $\hat{y}$ when compared to the internal high precision feedback sensors of the two Stewart platforms, $y$. The estimation error $e$ is divided into two main parts: the position error $(e_x,e_y,e_z)$ consists of the error in tracking the relative position in the x-, y-, and z-directions between the two bodies {*b*_{1}} and {*b*_{2}}, while the orientation error angles $(\phi_e,\theta_e,\psi_e)$ represent the error in the relative body-to-body orientation. The error time series of the previously mentioned signals are given in Figs. 6 and 7, where the error trajectories are illustrated as functions of time. The time period $t\in[50,100]$ s is highlighted to show the reduced tracking performance when the Leica tracker is deactivated or unavailable to the EKF algorithm.

From analyzing the positional error time series given in Fig. 6, it is evident that the positional tracking performance degrades significantly when the Leica tracker is unavailable to the EKF algorithm presented in Algorithm 1. This result is expected, since the MRU sensors fitted to each of the Stewart platforms are not as accurate as the Leica laser tracker, which offers state-of-the-art precision. In particular, the surge, sway, and yaw directions are less accurate because the internal MRU algorithm is not capable of producing the same accuracy as for the roll, pitch, and heave signals, according to Kongsberg's MRU datasheet [16] and previous experiments conducted with the MRUs in the motion laboratory.

As mentioned earlier, Fig. 7 shows that the orientation angles between the two body frames {*b*_{1}} and {*b*_{2}} are not affected as much as the position error in Fig. 6 when the Leica laser tracker is unavailable to the EKF algorithm. This is again because the roll and pitch angles are relatively robust and accurate compared to the positional MRU output. All the experimental results of this test scenario are quantified in terms of the root-mean-square (RMS) error $e_{RMS}$, the maximum absolute error $|e|_{MAX}$, the standard deviation $\sigma_e$, and the mean deviation $\mu_e$, where Tables 1–3 summarize the EKF performance for all three test periods.
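The four tabulated metrics can be computed per error channel as sketched below (an illustrative computation, applied to one error time series such as $e_x$):

```python
import math

def error_metrics(e):
    """Return (RMS error, max absolute error, sample std dev, mean) of e."""
    n = len(e)
    mu = sum(e) / n                                   # mean deviation mu_e
    rms = math.sqrt(sum(x * x for x in e) / n)        # RMS error e_RMS
    emax = max(abs(x) for x in e)                     # max absolute error
    sigma = math.sqrt(sum((x - mu) ** 2 for x in e) / (n - 1))  # std dev
    return rms, emax, sigma, mu
```

Note that the RMS error is taken about zero while the standard deviation is taken about the mean, so a biased estimate shows up as $e_{RMS} > \sigma_e$ with a nonzero $\mu_e$.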

## Discussion and Conclusion

In this paper, estimation of the relative motion between two ships in six degrees-of-freedom has been demonstrated by experiments. The sensor fusion algorithm is based on a discrete implementation of the extended Kalman filter using different sensor types (inertial measurement and time-of-flight). When both sensor types were available to the sensor fusion algorithm, the maximum absolute error in position was less than 10 mm with a standard deviation of less than 2 mm. The maximum orientation error was found to be less than 0.2 deg with a standard deviation of less than 0.05 deg. For 50 s in the middle of the experiment, the time-of-flight sensor was made unavailable to the sensor fusion algorithm to demonstrate the reliability and redundancy of the proposed solution. When the system was using only the two motion reference units (one on each motion platform), the maximum absolute error in position increased by a factor of five to about 50 mm with a standard deviation of less than 12 mm. The maximum orientation error increased by only 25% to about 0.25 deg with a standard deviation of less than 0.08 deg. The resulting ship-to-ship tracking performance, both with and without the Leica tracker, should still be acceptable even if the error is multiplied by a conservative factor of 10 to account for the approximate 1:10 lab scale.

The results demonstrated in this paper show that real-time sensor fusion based on the extended Kalman filter with an update cycle time of 5 ms is achievable using two standard off-the-shelf MRUs (Kongsberg/SEATEX MRU H, Trondheim, Norway), an industrial PC (Beckhoff CX2040, Verl, Germany), and a laser tracker (Leica AT960, Aarau, Switzerland). The achieved accuracy was 50 mm or better in position and 0.25 deg or better in orientation, as verified using the internal sensors of the two motion generators (Stewart platforms). The resulting sensor fusion system presented in this paper aims at supporting the higher level of autonomy expected in future marine operations. In addition, it may contribute to future development of crane controllers capable of compensating for the secondary ship motion in real-time when transporting cargo from one ship onto another. Such crane capabilities may increase the weather window in which such operations are allowed today, and hence also be economically beneficial if ship-to-ship operations can be carried out without waiting for suitable weather conditions.

In future development of the ship-to-ship state estimation problem, the relatively high-cost MRU sensors and the Leica laser tracker could be substituted with less costly sensors and the overall performance could be compared to the one presented in this work. In addition, one might consider including the two ship's physical parameters, either as static known parameters or as unknown parameters, which have to be estimated in real-time by augmenting them in the state vector.

## Acknowledgment

The research presented in this paper has received funding from the Norwegian Research Council, SFI Offshore Mechatronics, project number 237896.

## Funding Data

Norges Forskningsrad (237896).