Future Electronics – FLIR Lepton® LWIR thermal imaging camera demo on Microsemi IGLOO®2 Creative board

By: Future Electronics System Design Center

Engineers often choose microcontrollers over Field Programmable Gate Arrays (FPGAs) because they assume that MCUs are cheaper, easier to programme and more reliable. In fact, an FPGA can implement a microcontroller, but not the other way round. An FPGA is a device which enables a developer to synthesize digital circuits: it can be reconfigured, reprogrammed and redesigned in countless ways to fit the application. The key difference between the two is that FPGAs are optimised for parallel, pipelined designs, while microcontrollers are optimised for serial execution. Nevertheless, FPGAs can also synthesize serial systems, as in the demo discussed here: thermal imaging on the Microsemi IGLOO2 Creative board.

FLIR Lepton LWIR thermal imaging camera and Microsemi IGLOO2 FPGA
The Future Electronics System Design Center (SDC) developed a thermal imaging video streaming demo using the FLIR Lepton LWIR thermal imaging camera and the Microsemi IGLOO2 Creative board, as shown in Figure 1.


Fig. 1: FLIR Lepton LWIR thermal imaging camera mounted on Microsemi IGLOO2 FPGA

The video streaming pipeline is a cascade of three main blocks, shown in Figure 2:
1. continuous collection of the pixels captured by the FLIR thermal sensor over the Video over SPI (VoSPI) protocol
2. data processing in the FPGA
3. display in a PC application (GUI).

FLIR Lepton thermal module
The thermal camera module, with a resolution of 80 x 60 pixels, is the most compact Long-Wave Infrared (LWIR) sensor available as an OEM product: it measures just 8.5mm x 8.5mm x 5.6mm. The LWIR camera module is also around ten times less expensive than a traditional IR camera. The camera is controlled by the FPGA: following a synchronisation event triggered by the FPGA, it streams a continuous sequence of VoSPI frames. Provided that synchronisation is maintained, a VoSPI stream can continue indefinitely. The Libero SoC project therefore includes a camera control interface dedicated to establishing synchronisation.


Fig. 2: FLIR Lepton thermal module and Microsemi IGLOO2 FPGA block diagram

IGLOO2 FPGA Libero SoC project
The biggest challenge of the project is system synchronisation. Designing with an FPGA is like playing with building blocks: the idea is decomposed into many elementary blocks which, combined together, build the system. The first block is the Camera Control Interface (CCI), shown in Figure 3, which provides the timing to the control unit of the system. This block is enabled after a 185ms delay, to guarantee synchronisation of the thermal sensor, and only when the FIFO is empty and the UART is ready to transmit. These last two conditions matter because the data needs to be stored in an SRAM, so that one full frame can be acquired before being transmitted over the UART. The FIFO comprises a FIFO controller and an SRAM: the controller transfers one byte at a time from the CCI into the SRAM and, once the SRAM has accumulated one video frame, transmits everything to the UART, emptying the memory. This design can be seen as a water-filling scheme, like the one used in communication system design.
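The store-and-forward behaviour of the FIFO block can be modelled in a few lines of Python. This is a behavioural sketch of the buffer-then-drain (water-filling) policy, not the actual RTL; the frame size of 60 packets of 164 bytes each comes from the packet format used by the camera:

```python
from collections import deque

FRAME_BYTES = 60 * 164   # one full video frame: 60 packets of 164 bytes


class FrameFifo:
    """Byte-wide FIFO that releases data only in whole-frame bursts."""

    def __init__(self):
        self.sram = deque()

    def write_byte(self, b: int):
        # CCI side: one byte at a time into the SRAM.
        self.sram.append(b)

    def read_frame(self):
        # UART side: drain one complete frame, or nothing at all.
        if len(self.sram) < FRAME_BYTES:
            return None
        return bytes(self.sram.popleft() for _ in range(FRAME_BYTES))
```

The key property mirrored here is that the UART side never sees a partial frame, which is what decouples the camera's timing from the UART's timing.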


Fig. 3: Block diagram of FLIR Lepton thermal imaging on Microsemi Creative board IGLOO2

Another aspect to take into account is the relative speed of the CCI and the UART. The CCI has to operate within a range of 2MHz to 20MHz, as specified in FLIR’s Lepton datasheet, while the UART needs to run at a frequency which ensures that packets are not overwritten and that the CCI is re-enabled at the right time. The CCI clock is therefore set to 20MHz, and the UART clock to 24MHz with a baud rate of 460800.
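A quick back-of-envelope check shows that this baud rate can drain a full frame well within a usable display rate. The calculation below assumes a standard 8N1 UART frame (start bit + 8 data bits + stop bit = 10 bits per byte), which is an assumption, not something stated in the article:

```python
# Link-budget sketch for one video frame over the UART.
PACKETS_PER_FRAME = 60
BYTES_PER_PACKET = 164          # 4-byte ID/CRC header + 160-byte payload
BAUD = 460800
BITS_PER_BYTE_ON_WIRE = 10      # 8N1 framing (assumption)

frame_bytes = PACKETS_PER_FRAME * BYTES_PER_PACKET
uart_bytes_per_s = BAUD / BITS_PER_BYTE_ON_WIRE
frame_time_s = frame_bytes / uart_bytes_per_s

print(frame_bytes)                      # 9840 bytes per frame
print(round(frame_time_s * 1000, 1))    # 213.5 ms to drain one frame
```

At roughly 213ms per frame, the UART sustains a few frames per second on the PC side.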

To make synchronisation management easier, the CCI communicates with the thermal sensor and executes four main synchronisation actions, specified in the FLIR Lepton datasheet:
• De-assert the Chip Select and the SPI clock for at least 5 frame periods (>185ms) to ensure that the VoSPI interface puts the Lepton in the proper state to establish, or re-establish, synchronisation.
• Assert the Chip Select and enable the SPI clock, allowing the Lepton to start transmitting the first packet.
• Examine the ID field of the packet, identifying a discard packet.
• Continue reading packets. When a new frame is available, which should be less than 39ms after asserting the Chip Select and reading the first packet, the first video packet is transmitted. The master and slave are now synchronised.

The CCI starts the communication with the thermal module and processes the packets as they are received. In its default configuration, the camera transmits packets 164 bytes long, with 4 bytes dedicated to the ID and CRC and 160 bytes to the payload, as shown in Figure 4. The payload carries the temperature values of 80 pixels, each encoded on 14 bits and occupying 2 bytes; one packet therefore corresponds to one line of the image.
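Splitting such a packet into its fields is straightforward. The sketch below assumes MSB-first (big-endian) byte order, matching the way the SPI bus shifts data out, and masks each pixel word down to its 14 significant bits:

```python
import struct


def parse_packet(pkt: bytes):
    """Split a 164-byte VoSPI packet into (line_id, crc, pixels)."""
    assert len(pkt) == 164
    # 2-byte ID and 2-byte CRC, big-endian (assumption: MSB-first SPI).
    line_id, crc = struct.unpack(">HH", pkt[:4])
    # 80 pixels, 2 bytes each, in the 160-byte payload.
    pixels = struct.unpack(">80H", pkt[4:])
    pixels = [p & 0x3FFF for p in pixels]    # keep the 14 significant bits
    return line_id & 0x0FFF, crc, pixels     # low 12 bits hold the line number
```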

ID (2 bytes) | CRC (2 bytes) | Payload (160 bytes)

Fig. 4: Generic video packet

As mentioned in the Lepton module specification, at the beginning of SPI video transmission, until synchronisation is achieved, and also in the idle period between frames, the Lepton transmits discard packets until a new frame is available from its imaging pipeline. The 2-byte ID field of a discard packet is always xFxx (where ‘x’ signifies a ‘don’t care’ condition), as shown in Figure 5.

ID (xFxx) | CRC (xxxx) | Discard data (same number of bytes as a video packet)

Fig. 5: Discard packet

If a discard packet is detected, the CCI disables the communication with the thermal sensor, raising the SPI Chip Select and stopping the SPI clock for the length of the packet. If, instead, a valid packet is detected, it is sent to the FIFO, stored in the SRAM and forwarded to the UART as soon as a full video frame (60 packets) has been acquired.

A loss of synchronisation can be caused by three main violations, as shown in Figure 6:
• Intra-packet timeout. Once a packet starts, it must be completely clocked out within 3 line periods.
• Failing to read out all packets for a given frame before the next frame is available.
• Failing to read out all available frames.


Fig. 6: Synchronisation diagram

Synchronisation is clearly difficult to manage. Due to the complexity of the synchronisation logic, the design is kept low-level, minimising multi-cycle paths and combinatorial processes.

Cascaded after the CCI, the FIFO interfaces the system control unit with the UART. One full video frame is stored before being transferred to the UART, which reduces the complexity of the synchronisation logic. The FIFO consists of a FIFO control unit, with the Prefetch option enabled, and an external SRAM memory. The goal of this design is to make the system fast and efficient enough for the Lepton to keep its synchronisation.

The FIFO controller and the SRAM are therefore both set to read and write 60 x 164 bytes (one full video frame). Both have a write frequency matching the CCI clock, 20MHz, and a read frequency matching the UART clock, 24MHz: writes are driven by the serial camera interface, while reads are driven by the UART. Once the FIFO is full, the UART, clocked at 24MHz, starts receiving the data and sending it to the PC through the USB cable at a baud rate of 460800bps.

PC application
The USB throughput is a continuous sequence of 164-byte packets, while the video is a continuous sequence of 80-pixel (160-byte) lines, see Figure 7, ordered by increasing ID number into one video frame (a 60-row table), see Figure 8. As a first step, therefore, the GUI decimates each packet by dropping its 4-byte header, reducing the frame to 60 x 160 bytes.

Byte 0    Byte 1    Byte 2    Byte 3    …    Byte 158    Byte 159
Pixel 0   Pixel 0   Pixel 1   Pixel 1   …    Pixel 79    Pixel 79

Fig. 7: Byte-to-pixel mapping within one video line (line m)

The image is built on the same principle as a brush wiping a screen: from left to right, starting at the top and working down until the bottom-right corner is reached, as shown in Figure 8.
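The GUI-side decimation and left-to-right, top-to-bottom assembly can be sketched as follows. This assumes the low 12 bits of the ID field carry the line number and that pixel bytes arrive MSB first, consistent with the packet format described above (a sketch of the principle, not the actual GUI code):

```python
def assemble_frame(packets):
    """Build a 60 x 80 image from 60 video packets (GUI-side sketch).

    Each packet: 2-byte ID, 2-byte CRC, 160-byte payload.
    """
    rows = [None] * 60
    for pkt in packets:
        line = ((pkt[0] & 0x0F) << 8) | pkt[1]   # line number from the ID field
        payload = pkt[4:]                         # drop ID + CRC (decimation)
        # Two payload bytes per pixel, left to right across the line.
        rows[line] = [(payload[i] << 8) | payload[i + 1]
                      for i in range(0, 160, 2)]
    return rows    # rows[0] is the top line; pixels run left to right
```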


Fig. 8: One video frame, 60 x 80 pixels

The video streaming contains information about the temperature of each section of the image, represented with different colours, see Figure 9.


Fig. 9: GUI video streaming

The Lepton uses a histogram-based AGC algorithm, which converts the full-resolution 14-bit thermal image into a contrast-enhanced 8-bit image suitable for display. A simple linear mapping from 14 bits to 8 bits is the alternative; however, when a scene includes both cold and hot regions, linear AGC can produce an output image in which most pixels are mapped to either full black or full white, with very little use of the greyscale (8-bit) values in between, as shown in Figure 10. By default, the histogram used to generate Lepton’s 14-bit to 8-bit mapping function is collected from the full array.
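The principle behind histogram-based AGC can be illustrated with classic histogram equalisation: build a cumulative histogram (CDF) of the 14-bit values and use it as the 14-bit to 8-bit mapping function, so that grey levels are spent where the pixel values actually cluster. This is an illustration of the idea, not FLIR's exact AGC algorithm:

```python
def equalize_14_to_8(pixels):
    """Map 14-bit thermal values to 8-bit display values via the CDF."""
    hist = [0] * 16384               # one bin per possible 14-bit value
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution of the 14-bit values.
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    n = len(pixels)
    # Scale the CDF to the 0..255 display range.
    return [cdf[p] * 255 // n for p in pixels]
```

With a bimodal cold/hot scene, this mapping spreads the two clusters across the grey range instead of crushing them to black and white as a linear map would.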


Fig. 10: Histogram for a 3 x 3 pixel area

In conclusion, the design of a serial system with high synchronisation complexity can be tackled by building elementary blocks and then combining them through synchronisation signals, processing and control logic. For this type of design it is very important to analyse the system, together with the hardware of the Microsemi IGLOO2 Creative board and the FLIR Lepton LWIR thermal imaging camera, in order to optimise the final solution.

Orderable Part Number: Microsemi IGLOO2 Creative Board