Introduction: Unboxing the Jetson Nano & a Quick Start-Up with Two Vision Demos

About: Howdy, we are application engineers at Seeed. Sharing projects with the maker community is awesome. Hope you like it XD. Seeed is the IoT hardware enabler providing services that empower IoT developers to swift…

Summary

As you know, the Jetson Nano is now a star product, and it can bring neural-network technology to a wide range of embedded systems. Here is an unboxing article with the details of the product, the start-up process, and two vision demos.

Word count: 800 words & 2 videos

Reading time: 20 minutes

Audience:

  • Developers who are interested in AI but do not have a solid background
  • Developers who haven’t decided whether to buy it or not
  • Developers who bought it but haven’t got it yet

Buy NOW!

Step 1: What Is Jetson Nano?

Just in case, let me start with a short introduction.

The NVIDIA® Jetson Nano™ Developer Kit delivers the compute performance to run modern AI workloads at unprecedented size, power, and cost. Developers, learners, and makers can now run AI frameworks and models for applications like image classification, object detection, segmentation, and speech processing.
You can find more information on the official page.

What can it do? You can simply think of it as a Raspberry Pi with more compute resources, able to support large neural networks for a significant number of applications.
As for me, I am already preparing to build a classification network to identify my 6 stupid cats in the house and feed them automatically lol.

Step 2: Unboxing


Step 3: Start-Up

Preparations

You need to prepare:
  1. A microSD card (16 GB or larger)

  2. USB keyboard and mouse

  3. A screen (HDMI or DP)

  4. A Micro-USB (5V⎓2A) or power jack (5V⎓4A) power supply. The jack accepts a 2.1×5.5×9.5 mm
    plug with positive polarity.

  5. A laptop that can connect to the Internet and burn microSD cards.

  6. An Ethernet cable

Attention:

  • Not all power supplies rated 5V⎓2A can deliver their rated power stably. As far as I tested, the Jetson Nano is really sensitive to its power supply, and even minor power fluctuations can cause it to crash. Be sure to buy a high-quality power adapter.
  • Even USB devices should not be hot-plugged, or the system on this board may crash for an unknown reason.
  • A jumper (J48) selects either the J28 Micro-USB connector or the J25 power jack as the power source for the developer kit. Without the jumper, the kit is powered from the J28 Micro-USB connector; with the jumper, no power is drawn from J28 and the kit is powered via the J25 power jack.
  • There is no reset button, so every time it crashes, you have to restart it by manually cutting the power.
  • No built-in Wi-Fi module
  • No Bluetooth module

A terse tutorial

The start-up procedure for the Jetson Nano is just the same as for other ARM Linux boards, but just in case, here is a brief tutorial. Read the Official Guide for more info.



  1. Download the system image here.
  2. Burn it to your SD card. There are many tools that can do this; Win32 Disk Imager is recommended.
  3. Plug the microSD card (via a card reader) into your computer. It should be detected and appear as a drive in Windows.

  4. Open Win32 Disk Imager, choose the .img (image) file you want to write as the Image File, select the SD drive as the Device, and press Write.

  5. The writing process may take a while. Once it is done, remove the SD card.

  6. Insert the microSD card (with the system image written to it) into the slot on the underside of the Jetson Nano module.

  7. Power on. When the developer kit starts, the green LED next to the Micro-USB connector will light up.

  8. On first boot, the Jetson Nano Developer Kit will guide you through some initial settings, such as the system language and keyboard layout.

  9. Finally, you will see this screen. Congratulations!
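
If you are on Linux or macOS instead of Windows, dd can play the role of Win32 Disk Imager. The sketch below is a minimal illustration using placeholder files so it is safe to run anywhere; on real hardware you would replace the output file with your actual SD-card device (e.g. /dev/sdX), which dd overwrites without asking, so double-check the device name first.

```shell
# Placeholder file standing in for the downloaded Jetson Nano system image
# (on a real run, this would be the .img you downloaded and unzipped).
dd if=/dev/zero of=jetson-demo.img bs=1M count=4 status=none

# Record a checksum so the write can be verified afterwards.
sha256sum jetson-demo.img > jetson-demo.img.sha256

# "Flash" the image. On real hardware: of=/dev/sdX (your SD-card device).
# WARNING: dd overwrites the target without confirmation.
dd if=jetson-demo.img of=sdcard-demo.img bs=1M conv=fsync status=none

# Verify the copy is byte-identical to the source image.
cmp jetson-demo.img sdcard-demo.img && echo "flash verified"
```

The conv=fsync flag makes dd flush the data to the target before exiting, which matters when the target is a real SD card rather than a file.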

Step 4: Demo

Follow the Official Guide to configure the environment and compile the project. I ran 2 projects, image classification and face detection, as the demos. Now the environment for vision and deep learning is fully configured, and I'll get to work on my own project lol.

Attention:


  • There are some issues with the start-up code for the camera, and you need to configure it yourself to match your camera. More specifically:
    • Line 80 of jetson-utils/camera/gstCamera.c sets the default frame size:

      static const uint32_t DefaultWidth  = 1280;
      static const uint32_t DefaultHeight = 720;
    • Line 37 of jetson-inference/imagenet-camera/imagenet-camera.cpp (and the other demos) sets the camera index:

      #define DEFAULT_CAMERA -1
    • In some files the default camera index is not defined by a macro (e.g., gstCamera.h), so you may have to modify it manually if you have trouble opening the camera.
    • You can use the v4l2-ctl command in the terminal to get the index and supported frame formats of your camera (here $d is your device node, e.g. /dev/video0):

      v4l2-ctl --device=$d -D --list-formats
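
If you prefer to script the camera-index edit rather than open an editor, a one-line sed patch does the job. This sketch creates a stand-in header file so it can run anywhere; on the Jetson you would point sed at the real source file (e.g. imagenet-camera.cpp) instead.

```shell
# Stand-in file mimicking the DEFAULT_CAMERA line in imagenet-camera.cpp
# (stand-in only -- on the Jetson, edit the real source file instead).
echo '#define DEFAULT_CAMERA -1' > camera-config-demo.h

# Switch the default camera index from -1 (onboard MIPI CSI camera)
# to 0, which corresponds to /dev/video0 (typically a USB webcam).
sed -i 's/#define DEFAULT_CAMERA -1/#define DEFAULT_CAMERA 0/' camera-config-demo.h

cat camera-config-demo.h
```

Remember to rebuild the project after changing the source, or the running binary will still use the old index.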
    

Thank you for reading & watching.