Adaptive integration time surface algorithm for event-based star tracker


Aiming at the attitude-tracking failures of traditional star sensors caused by blurred star images in highly dynamic environments, an adaptive integration time surface algorithm for an event-based star tracker is proposed to obtain clear star-point imaging in such environments. First, based on the noise characteristics of event streams, a spatiotemporal filter is constructed to remove noise. Second, a fitting method based on elliptical feature parameters is designed to characterize the star points within virtual frames, and the integration time is adjusted online according to the geometric shape of the star points. Finally, experiments in various dynamic environments demonstrate that the proposed algorithm generates clear star-point images and effectively eliminates motion blur of star points. In a highly dynamic environment with an angular velocity of 20 °/s and a linear velocity of 400 mm/s, the angular distance error of the proposed algorithm is reduced by 77.9% and 70.7%, respectively, compared with the commonly used event-stream integration method.

 

The star tracker is a high-precision instrument that measures attitude autonomously by observing stars. With advantages such as high accuracy and long working life, it helps ensure the reliable flight and operation of spacecraft, and it is of strategic importance in aerospace applications such as deep-space exploration, remote sensing, and formation flight. Traditional star sensors usually use CCD or CMOS image sensors based on a frame-imaging mode: they capture the distribution of stars in the field of view during the exposure time, then complete spacecraft attitude measurement and tracking by extracting star centroids and matching star maps. Large maneuvers of the carrier can blur the star map and cause star points to trail, seriously degrading the accuracy of centroid extraction and limiting the use of star sensors.

For the motion-blur problem of traditional optical star sensors, commonly used deblurring methods include Wiener filtering, the Richardson-Lucy (RL) algorithm, and an improved Radon method. These methods target traditional frame-based sensors; they are prone to noise interference, algorithmically complex, and perform poorly when the dynamics are too large.

Event cameras are sensors whose pixels trigger asynchronously in response to changes in light intensity. Compared with traditional frame cameras, their imaging paradigm differs fundamentally, featuring high dynamic range, high temporal resolution, and low power consumption, which makes them particularly suitable for motion detection. They have therefore been applied in fields such as scene reconstruction, autonomous driving, and object tracking. Given these outstanding advantages, their applications have gradually expanded to space exploration. Cohen et al. conducted detection experiments on stars and planets at varying levels, verifying the feasibility of continuous real-time celestial determination with event cameras. Afshar et al. proposed a detection and tracking method for space-target data collected by event cameras, including feature-based detectors and a sequential least-squares tracker that can track multiple targets simultaneously. Chin et al. studied a relative-attitude computation scheme for star tracking with event cameras, showing lower power consumption and faster operation than traditional sensors. Lv Yuanyuan et al. used event cameras to detect space targets and established and validated a sensitivity model for event-camera space-target detection, providing a reference for observation experiment settings. Zhou Xiaoli et al. studied a denoising algorithm for space-object event stream data, laying a foundation for subsequent space-scene vision applications.

Because of the asynchronous imaging characteristics of event cameras, events within a certain time window must be integrated to construct a time-surface image. The integration time window depends on the relative dynamics of the target: in highly dynamic environments, an integration time that is too long causes blurring and trailing of the star-spot image, while one that is too short yields insufficient star-spot features that are difficult to recognize. In practice, the dynamic environment is complex, and no single preset value fits all scenes. This paper therefore proposes an adaptive integration time surface algorithm for the event-based star sensor to meet the need for clear star-spot imaging in dynamic environments. The algorithm performs elliptical fitting on the star-point targets formed by integrating the denoised event stream and adaptively adjusts the integration time based on the fitting results, achieving effective extraction of star-point targets in dynamic environments and improving the dynamic detection capability of the star sensor.

1 Dynamic vision sensor

The dynamic vision sensor, also known as the event camera, originated from the silicon retina design. Unlike traditional cameras, which generate complete images at a fixed frame rate, the event camera's pixels are independently sensitive to changes in light intensity: when the brightness change at a pixel exceeds a preset threshold, an event is triggered, as shown in Figure 1.

Fig.1 Schematic diagram of events generated by a single photosensitive pixel unit

The event stream is transmitted using the address-event representation (AER) protocol, where I denotes the photocurrent. Each pixel of the event camera independently responds to changes in its logarithmic photocurrent (used to characterize the brightness L). The technical scheme of the event-based star sensor is shown in Figure 2.

Fig.2 Event based star sensor work flow chart
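The per-pixel triggering rule in Figure 1 can be sketched as follows. This is a minimal, illustrative model of a single pixel, assuming sampled log-photocurrent values and a hypothetical contrast threshold; it is not the sensor's actual circuit behavior.

```python
def pixel_events(log_intensity, threshold):
    """Illustrative model of one event-camera pixel: emit an event
    (sample index, polarity) whenever the log photocurrent has changed
    by at least `threshold` since the last emitted event."""
    events = []
    ref = log_intensity[0]  # reference level at the last event
    for i in range(1, len(log_intensity)):
        delta = log_intensity[i] - ref
        if abs(delta) >= threshold:
            events.append((i, 1 if delta > 0 else -1))
            ref = log_intensity[i]  # reset the reference level
    return events
```

A brightness ramp up followed by a drop produces ON events (+1) followed by an OFF event (-1).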

Unlike traditional star sensors, which accumulate incoming photoelectrons during the exposure time to form grayscale star maps, an event-camera-based star sensor must denoise the output asynchronous event stream and integrate it into virtual star-map frames before performing centroid extraction, star map recognition, and attitude calculation. Among these steps, the choice of integration time is one of the key factors for accurate centroid extraction in highly dynamic environments. If the integration time is too short, the target structure in the virtual frame is weak and cannot fully express the target features; if it is too long, the target trails, seriously degrading star map recognition. A suitable integration time yields clear virtual star-map frames with distinct target features and without excessive displacement or motion blur. The setting of the integration time is closely related to the dynamics of the target: when relative motion is intense, a shorter integration time yields clear point targets; when relative motion is gentle, a longer integration time is needed to obtain clear star-point features. However, the relative motion cannot be known in advance, making it difficult to set one integration length for all scenarios. This paper addresses this uncertainty in relative dynamics by studying an adaptive integration time surface algorithm that ensures clear star-map imaging in complex dynamic environments.

2 Adaptive Integral Time Surface Algorithm

Against the starry-sky background, star-point targets are weak and the star-point event stream is sparse. A single event carries too little information, so traditional image-processing methods cannot be applied directly. The star-point event stream must therefore be integrated into virtual frames that reconstruct the geometric structure of the scene and enrich the target feature information. The star event stream is the output of the event camera over a continuous period of motion; integrating it produces N virtual frame images.

The problem with a fixed time window is that if the integration time is too long, events stack up and the target trails; if it is too short, the target events are sparse. Another common method is to accumulate a fixed number k of events into each virtual frame. Its disadvantage is a lack of globality: it is difficult to choose an appropriate global k for complex scenes with multiple features. To address this, a star-event-stream integration method, the adaptive integration time surface algorithm, is proposed. The algorithm adaptively adjusts the time window by evaluating the star-target features within the virtual frame. Figure 3 compares the three integration methods; the virtual frames on the three time axes show their respective results. The fixed time window and the global event count of the fixed-event-number method are both fixed, whereas the adaptive integration time surface algorithm dynamically adjusts the integration time window, making it more suitable for highly dynamic scenarios.

Fig.3 Three event stream integration methods
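As a minimal sketch of the fixed-time-window baseline (the simplest of the three methods in Figure 3), assuming events arrive as hypothetical `(x, y, t, p)` tuples, a virtual frame can be formed by counting the events that fall inside the window `[t0, t0 + dt)`:

```python
import numpy as np

def integrate_time_surface(events, t0, dt, width, height):
    """Accumulate the events falling in [t0, t0 + dt) into a virtual
    frame (a simple per-pixel event-count time surface)."""
    frame = np.zeros((height, width), dtype=np.uint16)
    for x, y, t, p in events:
        if t0 <= t < t0 + dt:
            frame[y, x] += 1
    return frame
```

The fixed-event-number variant instead slices the stream every k events; the adaptive method in this paper keeps this accumulation step but chooses dt online from the star-point shape.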

(1) Event flow denoising based on spatiotemporal filters

Because of its own components, circuit design, and environmental interference, the event camera's output stream contains many noise events. These may be mistakenly identified as star points, seriously affecting subsequent algorithms. Noise events are random and isolated, distributed randomly across the imaging plane, whereas the real events generated by star motion are continuous and persistent in time and space, always clustering together within a given period. The spatiotemporal correlation of events can therefore be exploited to construct a spatiotemporal filter that effectively removes noise events, as shown in Figure 4.

Fig.4 Schematic diagram of spatiotemporal filter denoising

For each incoming event, the spatiotemporal filter searches its spatial neighborhood for events whose timestamps lie within the threshold dT of the incoming event's timestamp. If such a supporting event exists, the event passes the filter; otherwise it is filtered out as noise.

For each event, the filter takes the following steps:

1) Store the event's timestamp in the timestamp memory of all 8 adjacent pixels, overwriting any previous timestamps;

2) Check whether the current event's timestamp is within dT of the timestamp previously written to the event's own location. If the time difference is less than dT, the event is retained and output as a valid event; otherwise it is removed as background noise.
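A minimal sketch of the two steps above, assuming events arrive as time-ordered `(x, y, t, p)` tuples (the sensor dimensions and time units are illustrative):

```python
import numpy as np

def spatiotemporal_filter(events, width, height, dt):
    """Two-step spatiotemporal filter: an event survives only if one of
    its 8-connected neighbors fired within the last dt (as recorded in a
    per-pixel timestamp memory)."""
    last_ts = np.full((height, width), -np.inf)  # timestamp memory
    kept = []
    for x, y, t, p in events:
        # Step 2: keep the event if a neighbor recently wrote a
        # timestamp to this pixel (time difference below dt).
        if t - last_ts[y, x] < dt:
            kept.append((x, y, t, p))
        # Step 1: write this event's timestamp into the memory of its
        # 8 adjacent pixels, overwriting the previous values.
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                nx, ny = x + dx, y + dy
                if 0 <= nx < width and 0 <= ny < height:
                    last_ts[ny, nx] = t
    return kept
```

A spatially clustered burst of events supports itself and passes, while an isolated noise event finds no recent neighbor timestamp and is removed.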

(2) Fitting Based on Elliptic Function Feature Parameters

Event streams convey information through event clusters: the movement of stars and changes in light intensity generate events that are closely related in time and space, known as event clusters. The event cluster generated by a star point is elliptical in shape, so least-squares ellipse fitting can be used to obtain the geometric center and the semi-major and semi-minor axes of the star-point event cluster.
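The paper uses least-squares ellipse fitting; as an illustrative approximation only, the center and semi-axes of an event cluster can also be estimated from its second-order moments (a 2-sigma covariance ellipse), which is sufficient to expose the elongation that signals motion blur:

```python
import numpy as np

def fit_cluster_ellipse(points):
    """Moment-based ellipse estimate for an event cluster: returns
    (center_x, center_y, semi_major, semi_minor, angle), where the
    semi-axes are the 2-sigma extents along the principal directions."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)          # 2x2 spatial covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    semi_minor, semi_major = 2.0 * np.sqrt(eigvals)
    angle = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])  # major-axis direction
    return center[0], center[1], semi_major, semi_minor, angle
```

For a near-circular cluster the axis ratio is close to 1; a trailing (blurred) cluster yields a ratio well above 1.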

In event-camera imaging, a star-point target is approximately circular, whereas in a highly dynamic environment the star-point event cluster becomes approximately elliptical owing to motion blur. Figure 5(a) shows the stars in the field of view, Figure 5(b) the star-point event stream output by the event camera, and Figure 5(c) the virtual frame obtained by integrating that stream. In an ideal virtual frame each star-point cluster should be approximately circular, so the geometric shape of the star points can be used to judge whether the current integration time window is suitable; if not, the integration time is adjusted further.

Fig.5 Star point in field of view, star event stream and star event stream virtual frame
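The text above does not give the adaptation rule numerically; the following sketch shows one plausible form of the online adjustment, where the elongation threshold, minimum size, and gain factors are illustrative assumptions only:

```python
def adapt_integration_time(dt, semi_major, semi_minor,
                           elong_max=1.3, size_min=2.0,
                           shrink=0.8, grow=1.25):
    """One hypothetical adaptation step for the integration window dt,
    driven by the fitted ellipse of a star-point cluster. All thresholds
    and gains here are illustrative assumptions, not values from the paper."""
    elongation = semi_major / semi_minor
    if elongation > elong_max:    # trailing star spot: shorten the window
        return dt * shrink
    if semi_major < size_min:     # weak/sparse target: lengthen the window
        return dt * grow
    return dt                     # near-circular, sufficiently sized: keep dt
```

Applied once per virtual frame, this drives dt toward the regime where star-point clusters stay compact and circular.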

To solve the star-motion-blur problem caused by an improper choice of integration time in highly dynamic environments, this paper proposes an adaptive integration time surface algorithm for event-based star sensors. The method uses the geometric shape of the star points in virtual frames to select the integration time adaptively. The spatiotemporal correlation of the event stream is first used for denoising; the denoised stream is then integrated, and the integration time is adjusted according to the geometric shape of the star points within the virtual frame, obtained by ellipse feature-parameter fitting. Experiments verify that the method selects a suitable integration time in highly dynamic environments and produces virtual frames free of motion blur.

 
