OV5640 with STM32F429 over DCMI (part I)

A picture is worth a thousand words, as they say. Right now it doesn't matter who "they" are; I had to have some sort of opening statement, right?
Anyhow, I've spent quite a bit of time trying to get the OmniVision OV5640 running and have decided to document some of my experience.


First - why this module?

Without any previous experience with camera modules, I was looking around for something that would have a reasonable resolution for modern times (at least 5MP) and autofocus. Googling around, I found out that there aren't that many options with autofocus. Well, pretty much none at all, apart from the OV5640-AF module.

Naive as I was, I thought I could just get a module, plug it in and, some code later, have my pictures. Off I went to ebay, got the cheapest one, and realized that there's absolutely no documentation for the given module. Darn.

OK, let's try throwing money at it. I went and bought the ArduCAM shield V2, a bunch of different modules, even a USB interface module. Cost me a pretty penny. Connected it all up, and it seems to work. At least I know that it works; now what? Reverse engineer their boards? Nah, throw more money at the WaveShare folks, who have made open-source camera boards and have some code that works with their STM32 boards. Much easier.

Turns out, there are a few particulars with cameras.

Camera BUSes

Most of the cameras above 2MP are MIPI CSI (Mobile Industry Processor Interface Camera Serial Interface, yes, a bit redundant), which is waaaay too fast to deal with without a decent processor - in the range of gigabits. That leaves the slower parallel interface, called DVP (Digital Video Port), which uses 8-10 parallel data lines, horizontal and vertical synchronization lines and a pixel clock to shuffle data. Some of the STM32 controllers have a DCMI (Digital Camera Interface) peripheral that can handle these transfers straight to memory via DMA.

The STM32 DCMI interface has 14 data pins (D[13:0]), a PIXCLK pixel clock coming from the camera, HSYNC for line synchronization and VSYNC for frame synchronization. For configuration an SCCB (Serial Camera Control Bus) interface is used - pretty much I2C, as it turns out. Apart from that, power: typically multiple different voltages (sensor, built-in DSP, focusing motor controller, flash, etc.).

Wiring it all up

Since the trusty Discovery board I have has its DCMI pins wired to the screen, I had to make myself a minimalistic devboard - the MCU with its garniture (power, clocking, interfaces) and the DCMI pins brought out for the WaveShare board. Of course, I messed up the pinout and had to hotwire it with jumpers. Oh, well. For testing it will do fine. Might as well test noise tolerance with that tangle. Power on the camera module gets downconverted from 3V3 using LDOs, so I didn't have to bother with that. Also needed is a clock in the 6-27 MHz range; I can try feeding it from an MCU pin. The shutdown and reset pins can be pulled down and up respectively, if no active control is necessary. Optimization for later.

Since the Chinese manufacturers of those AF modules didn't bother to bring out all 10 bits of the DVP bus, only 8, there's a recommendation to connect the 8 data pins to D[9:2] instead of D[7:0]. Not sure why; I'll figure it out later, perhaps.

We'll also need enough RAM to store the image. I have already written about setting that up before.

Basic communications

As it turns out, there is some documentation available online, including an OV5640 datasheet (plastered with "NDA", "CONFIDENTIAL" and similar watermarks) dating back to 2011. There are also two versions of an appnote, obviously written by someone not very fluent in English or technical writing. WaveShare also provides schematics and sample code for configuring and communicating with the camera. Sadly, the sample code is mostly just binary blobs without much description, and only some of the registers (by the looks of it, half at most) are described in the datasheet. Quite a few register descriptions are obfuscated by a "DEBUG" indicator, while in the code you see setup commands like "0x370c, 0x02,                //!!IMPORTANT".

I tried asking OmniVision for the full datasheet or other technical documentation, but they weren't interested in my couple-of-hundred-a-year order quantities. Oh, well. Guess I'll have to figure it out on my own.

As I mentioned before, setup is done over I2C, sending a 16-bit register address and then 8 bits of data. Just take the big-ass lists of registers and upload them to the camera, and then it starts clocking out pixel data. Depending on the settings in the magic number blobs, you get different resolutions, FPS and clock speeds. Figuring out what is what can be left for later; right now we just want to get something.
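For reference, each such write is just a plain HAL I2C memory write with a 16-bit address. A minimal sketch, assuming the camera hangs off hi2c1 and responds at the usual OV5640 write address 0x78 (7-bit 0x3C):

#define OV5640_I2C_ADDR 0x78	/* 7-bit address 0x3C, shifted left for HAL */

/* Write one 8-bit value into a 16-bit OV5640 register over SCCB/I2C */
HAL_StatusTypeDef OV5640_WriteReg(uint16_t reg, uint8_t val) {
	return HAL_I2C_Mem_Write(&hi2c1, OV5640_I2C_ADDR, reg,
			I2C_MEMADD_SIZE_16BIT, &val, 1, 100);
}

Uploading one of the blob lists is then just a loop over (register, value) pairs calling this helper.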

Once we are sure that we can communicate with the camera (at least read out the CHIP_ID registers), we need to set up the DCMI peripheral.
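A quick sanity sketch (again assuming the hi2c1 handle): the two CHIP_ID registers at 0x300A/0x300B should read back 0x56 and 0x40.

uint8_t id[2];

/* CHIP_ID high byte lives at 0x300A, low byte at 0x300B */
if (HAL_I2C_Mem_Read(&hi2c1, OV5640_I2C_ADDR, 0x300A,
		I2C_MEMADD_SIZE_16BIT, id, 2, 100) == HAL_OK
		&& id[0] == 0x56 && id[1] == 0x40) {
	/* camera answers on the bus */
}

With the camera answering, the DCMI peripheral and its DMA setup looks like this: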

DCMI_HandleTypeDef hdcmi;
DMA_HandleTypeDef hdma_dcmi;

/* DCMI init function */
void MX_DCMI_Init(void) {

	hdcmi.Instance = DCMI;
	hdcmi.Init.SynchroMode = DCMI_SYNCHRO_HARDWARE;
	hdcmi.Init.PCKPolarity = DCMI_PCKPOLARITY_RISING;
	hdcmi.Init.VSPolarity = DCMI_VSPOLARITY_LOW;
	hdcmi.Init.HSPolarity = DCMI_HSPOLARITY_LOW;
	hdcmi.Init.CaptureRate = DCMI_CR_ALL_FRAME;
	hdcmi.Init.ExtendedDataMode = DCMI_EXTEND_DATA_8B;
	hdcmi.Init.JPEGMode = DCMI_JPEG_ENABLE;
	if (HAL_DCMI_Init(&hdcmi) != HAL_OK) {
		_Error_Handler(__FILE__, __LINE__);
	}

}

void HAL_DCMI_MspInit(DCMI_HandleTypeDef* dcmiHandle) {

	GPIO_InitTypeDef GPIO_InitStruct;
	if (dcmiHandle->Instance == DCMI) {

		__HAL_RCC_DCMI_CLK_ENABLE();
		__HAL_RCC_DMA2_CLK_ENABLE();

		/**DCMI GPIO Configuration
		 PA4     ------> DCMI_HSYNC
		 PA6     ------> DCMI_PIXCK
		 PD3     ------> DCMI_D5
		 PE4     ------> DCMI_D4
		 PH9     ------> DCMI_D0
		 PH10     ------> DCMI_D1
		 PH11     ------> DCMI_D2
		 PH12     ------> DCMI_D3
		 PI5     ------> DCMI_VSYNC
		 PI6     ------> DCMI_D6
		 PI7     ------> DCMI_D7
		 */

		GPIO_InitStruct.Pin = GPIO_PIN_4 | GPIO_PIN_6;
		GPIO_InitStruct.Mode = GPIO_MODE_AF_PP;
		GPIO_InitStruct.Pull = GPIO_NOPULL;
		GPIO_InitStruct.Speed = GPIO_SPEED_FREQ_LOW;
		GPIO_InitStruct.Alternate = GPIO_AF13_DCMI;
		HAL_GPIO_Init(GPIOA, &GPIO_InitStruct);

		GPIO_InitStruct.Pin = GPIO_PIN_3;
		GPIO_InitStruct.Mode = GPIO_MODE_AF_PP;
		GPIO_InitStruct.Pull = GPIO_NOPULL;
		GPIO_InitStruct.Speed = GPIO_SPEED_FREQ_LOW;
		GPIO_InitStruct.Alternate = GPIO_AF13_DCMI;
		HAL_GPIO_Init(GPIOD, &GPIO_InitStruct);

		GPIO_InitStruct.Pin = GPIO_PIN_4;
		GPIO_InitStruct.Mode = GPIO_MODE_AF_PP;
		GPIO_InitStruct.Pull = GPIO_NOPULL;
		GPIO_InitStruct.Speed = GPIO_SPEED_FREQ_LOW;
		GPIO_InitStruct.Alternate = GPIO_AF13_DCMI;
		HAL_GPIO_Init(GPIOE, &GPIO_InitStruct);

		GPIO_InitStruct.Pin = GPIO_PIN_9 | GPIO_PIN_10 | GPIO_PIN_11 | GPIO_PIN_12;
		GPIO_InitStruct.Mode = GPIO_MODE_AF_PP;
		GPIO_InitStruct.Pull = GPIO_NOPULL;
		GPIO_InitStruct.Speed = GPIO_SPEED_FREQ_LOW;
		GPIO_InitStruct.Alternate = GPIO_AF13_DCMI;
		HAL_GPIO_Init(GPIOH, &GPIO_InitStruct);

		GPIO_InitStruct.Pin = GPIO_PIN_5 | GPIO_PIN_6 | GPIO_PIN_7;
		GPIO_InitStruct.Mode = GPIO_MODE_AF_PP;
		GPIO_InitStruct.Pull = GPIO_NOPULL;
		GPIO_InitStruct.Speed = GPIO_SPEED_FREQ_LOW;
		GPIO_InitStruct.Alternate = GPIO_AF13_DCMI;
		HAL_GPIO_Init(GPIOI, &GPIO_InitStruct);

		/* DCMI DMA Init */
		/* DCMI Init */
		hdma_dcmi.Instance = DMA2_Stream7;
		hdma_dcmi.Init.Channel = DMA_CHANNEL_1;
		hdma_dcmi.Init.Direction = DMA_PERIPH_TO_MEMORY;
		hdma_dcmi.Init.PeriphInc = DMA_PINC_DISABLE;
		hdma_dcmi.Init.MemInc = DMA_MINC_ENABLE;
		hdma_dcmi.Init.PeriphDataAlignment = DMA_PDATAALIGN_WORD;
		hdma_dcmi.Init.MemDataAlignment = DMA_MDATAALIGN_WORD;
		hdma_dcmi.Init.Mode = DMA_NORMAL;
		hdma_dcmi.Init.Priority = DMA_PRIORITY_HIGH;
		hdma_dcmi.Init.FIFOMode = DMA_FIFOMODE_ENABLE;
		hdma_dcmi.Init.FIFOThreshold = DMA_FIFO_THRESHOLD_FULL;
		hdma_dcmi.Init.MemBurst = DMA_MBURST_SINGLE;
		hdma_dcmi.Init.PeriphBurst = DMA_PBURST_SINGLE;
		if (HAL_DMA_Init(&hdma_dcmi) != HAL_OK) {
			_Error_Handler(__FILE__, __LINE__);
		}

		__HAL_LINKDMA(&hdcmi, DMA_Handle, hdma_dcmi);

		/* DCMI interrupt Init */
		HAL_NVIC_SetPriority(DCMI_IRQn, DCMI_IRQ_PRI, DCMI_IRQ_SUB_PRI);
		HAL_NVIC_EnableIRQ(DCMI_IRQn);

		/* DMA interrupt init */
		/* DMA2_Stream7_IRQn interrupt configuration */
		HAL_NVIC_SetPriority(DMA2_Stream7_IRQn, DCMI_DMA_IRQ_PRI, DCMI_DMA_IRQ_SUB_PRI);
		HAL_NVIC_EnableIRQ(DMA2_Stream7_IRQn);
	}
}  

We set up the pins for the DCMI alternate function, configure the peripheral and add DMA. You really don't want to deal with DCMI without DMA, especially at higher clock speeds (tens of MHz); the STM32 HAL doesn't even allow you to. I haven't bothered to check whether the hardware supports it, since I'm totally not interested.

The thing to note here is that the DCMI interface deals with WORDS (i.e. 4 bytes) at a time. If the image data length is not divisible by 4, it gets padded with 0x00.

The MspInit section is reasonably straightforward; the DCMI config struct has the following members:

  • SynchroMode - use hardware (synchronization signals are sent on the VSYNC/HSYNC pins) or embedded (synchronization codes are embedded in the datastream itself) synchronization;
  • PCKPolarity determines whether data on D[x] is sampled on the rising or falling edge of PIXCLK. Must match what the camera provides;
  • VSPolarity specifies whether the VSYNC signal is active high or low;
  • HSPolarity specifies whether the HSYNC signal is active high or low;
  • CaptureRate allows you to capture all frames, every 2nd or every 4th frame. Can be useful if you are dumping data to a screen for preview, where you don't need 120 FPS;
  • ExtendedDataMode specifies the number of data signals you have. Normally an even number of bits (8/10/12/14);
  • JPEGMode specifies the type of datastream. In JPEG mode image sizes are not fixed due to compression. I want to test with JPEG, since it allows for slower transfer rates.

Now just hook up the DCMI and DMA interrupt handlers:

/* DCMI (camera) global interrupt handler */
void DCMI_IRQHandler(void) {
	HAL_DCMI_IRQHandler(&hdcmi);
}

/* DMA2 Stream7 (DCMI) interrupt handler */
void DMA2_Stream7_IRQHandler(void) {
	HAL_DMA_IRQHandler(&hdma_dcmi);
}

and you're good to capture. Once you're done configuring the camera over SCCB, just enable the DCMI frame interrupt and start capturing via DMA:

__HAL_DCMI_ENABLE_IT(&hdcmi, DCMI_IT_FRAME);
HAL_DCMI_Start_DMA(&hdcmi, DCMI_MODE_SNAPSHOT, (uint32_t) memory_location, img_size);

The DCMI interface will wait for the beginning of the next frame (as indicated by VSYNC) and start dumping the data into memory_location until it reaches the end of the frame (again, indicated by VSYNC). SNAPSHOT mode, unlike CONTINUOUS, stops capture after a single frame, while CONTINUOUS keeps capturing until DCMI is explicitly turned off. img_size is the transfer length in 32-bit words, so it has to describe a buffer large enough to hold the image. If you have configured your DMA with a circular buffer, DMA will wrap around once img_size words have been written to memory_location.
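To know when the frame has actually landed in memory, the HAL offers a weak frame-event callback that fires on the frame interrupt enabled above. A minimal sketch (the flag name is mine):

static volatile uint8_t frame_done = 0;

/* Called from HAL_DCMI_IRQHandler() once VSYNC marks the end of the captured frame */
void HAL_DCMI_FrameEventCallback(DCMI_HandleTypeDef *hdcmi) {
	frame_done = 1;
}

Poll (or otherwise wait on) that flag in the main loop before poking around the buffer.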

Since the JPEG size is inherently unknowable beforehand (due to compression), you'll have to find the image start (0xFFD8) and image end (0xFFD9) markers in your memory location. Once you have found them, you can just dump the memory between those markers (including the markers themselves) and save it as a .jpeg image.
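A rough sketch of that marker hunt - the buffer pointer and length are placeholders for wherever and however much you captured:

/* Returns the JPEG length in bytes and writes its offset into *start,
 * or returns 0 if no complete 0xFFD8 ... 0xFFD9 sequence was found. */
uint32_t JPEG_FindImage(const uint8_t *buf, uint32_t len, uint32_t *start) {
	uint32_t i, soi = 0;
	uint8_t soi_found = 0;

	for (i = 0; i + 1 < len; i++) {
		if (!soi_found && buf[i] == 0xFF && buf[i + 1] == 0xD8) {
			soi = i;
			soi_found = 1;
		} else if (soi_found && buf[i] == 0xFF && buf[i + 1] == 0xD9) {
			*start = soi;
			return (i + 2) - soi;	/* include the end marker itself */
		}
	}
	return 0;
}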

And now I have a 5MP camera actively capturing and dumping images. It's still buggy, but that seems to be a configuration issue. I'll have to figure out those binary blobs to configure the camera the way I need it to work.

4 comments:

  1. Hi !

    Thank you for the article, one of the very few on this topic.

    I am also working on the OV5640, would you be open for a chat to discuss my findings?

    Cheers,
    Anup

  2. Hi.
    Sorry for comms delay, for some reason my domain is blacklisted at work and I couldn't respond in a timely manner.
    Sure we can discuss. But I would suggest having this discussion in some public forum, so that other people can read it as well and maybe get some useful info out of it. Otherwise I might have to summarize stuff in a separate blog post and might miss something.
    So I propose we have this discussion right here (drive up my blog rankings! yay!) :)

    1. Hi

      I am trying to put together a resource for OV2640 and OV5640 cameras on GitHub, sort of an unofficial guide while being open source. I wanted to do this in my spare time. The official docs do not do justice to the sensors or their applications. Most drivers seem to take an unfavorable route while documenting the workings and output.

      Also, I feel a bare generic driver with a step by step guide would help many get started instead of wandering around.

      I liked your blog and thought about asking you to contribute.

    2. Damn log-in issues today.
      I had the same idea to publish documented driver on github, once it's in more or less usable state, but so far it's barely there.
      Sure, post link to your repo, I'll fork it if I have anything to add.
