...

Getting started with Jetson Nano | How to install jetson-inference and classify objects


How to Install jetson-inference on Jetson Nano (Complete Build Tutorial) – Jetson Nano Setup

Jetson Nano Setup Overview

Setting up the Jetson Nano correctly is the fastest way to start deploying AI models at the edge. In this guide you will install the required developer tools, clone NVIDIA’s jetson-inference repository, and compile it from source. You will also install the Python development headers and NumPy so the Python bindings and examples build cleanly. Finally, you will register the new libraries with ldconfig and, optionally, add Visual Studio Code for a smoother workflow. Let’s dive into the Jetson Nano setup process.
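Before running the installation commands, it helps to confirm which JetPack/L4T release is flashed on the board and how much free memory and disk space is available. The quick checks below are a minimal sketch and assume a standard JetPack image, where the release info lives in /etc/nv_tegra_release and CUDA is installed under /usr/local/cuda.

### Show the L4T (JetPack) release flashed on the board.
cat /etc/nv_tegra_release

### Confirm the CUDA toolkit version (default JetPack install path assumed).
/usr/local/cuda/bin/nvcc --version

### Check free memory and disk space before starting a long compile.
free -h
df -h /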

Video link : https://youtu.be/G7w6q0CHUPc

Setup and Build jetson-inference on Jetson Nano

Description :
Run these commands in order to update your Jetson Nano, install the build tools, clone the repository with its submodules, configure with CMake, compile, install, refresh the library cache, and optionally install VS Code.

### Update package indexes so you get the latest versions of all packages.
sudo apt-get update

### Install Git for source control and CMake for generating build files.
sudo apt-get install git cmake

### Clone the jetson-inference repository that contains inference demos and utilities.
git clone https://github.com/dusty-nv/jetson-inference

### Enter the repository directory to work locally.
cd jetson-inference

### Initialize and update all nested submodules required by the project.
git submodule update --init

### Install Python development headers and NumPy for building Python bindings and running examples.
sudo apt-get install libpython3-dev python3-numpy

### Ensure you are in the project root before creating the out-of-source build directory.
cd jetson-inference    # omit if working directory is already jetson-inference/ from above

### Create a dedicated build directory to keep compiled files separate from source.
mkdir build

### Move into the build directory to configure the project.
cd build

### Generate project build files with CMake, using default options and pointing it at the source directory.
cmake ../

### Move into the build directory if you are not already there to compile the project.
cd jetson-inference/build          # omit if working directory is already build/ from above

### Compile the source code (single-threaded by default; add -j$(nproc) to use all CPU cores).
make

### Install the compiled binaries and libraries into system paths.
sudo make install

### Refresh the dynamic linker cache so the new libraries are discoverable.
sudo ldconfig

### (Optional) Install Visual Studio Code on Jetson Nano for a better development experience.
# Install VS Code on Jetson Nano:
# https://github.com/JetsonHacksNano/installVSCode
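After sudo make install and sudo ldconfig complete, you can sanity-check the build by running one of the bundled classification samples. The snippet below is a sketch only: the binary name (imagenet, or imagenet-console on older releases) and the sample image path may differ between versions of the repository, and the first run spends several minutes downloading and optimizing the network with TensorRT.

### From the build/ directory used above, go to the compiled sample binaries.
cd aarch64/bin

### Classify a bundled test image with the default GoogleNet model.
./imagenet images/orange_0.jpg output_orange.jpg    # older releases: ./imagenet-console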

Your Jetson Nano is now prepared with Git, CMake, Python headers, NumPy, and a compiled jetson-inference build installed system-wide.
Running ldconfig ensures your libraries are correctly registered, and adding VS Code is optional but recommended for productivity.
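If you want to double-check that the libraries and Python bindings were registered, the commands below offer a minimal sketch. Note the assumptions: the Python module may be named jetson.inference (older releases) or jetson_inference (newer ones), and the VS Code helper script name comes from the JetsonHacksNano/installVSCode repository linked above and may change over time.

### Confirm the shared library is visible to the dynamic linker.
ldconfig -p | grep jetson-inference

### Try importing the Python bindings (swap in jetson_inference / jetson_utils on newer releases).
python3 -c "import jetson.inference, jetson.utils; print('Python bindings OK')"

### (Optional) Install VS Code with the JetsonHacks helper script.
git clone https://github.com/JetsonHacksNano/installVSCode
cd installVSCode
./installVSCode.sh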

You can find more similar tutorials in my blog posts page here : https://eranfeit.net/blog/

You can find more Nvidia Jetson Nano tutorials here : https://eranfeit.net/how-to-classify-objects-using-jetson-nano-inference-and-opencv/


Connect :

☕ Buy me a coffee — https://ko-fi.com/eranfeit

🖥️ Email : feitgemel@gmail.com

🌐 https://eranfeit.net

🤝 Fiverr : https://www.fiverr.com/s/mB3Pbb

Enjoy,

Eran
