
nvidia-tao-deploy install fails inside the nvcr.io/nvidia/l4t-tensorrt:r8.6.2-devel container on Jetson AGX Orin


Hi, I’m trying to install nvidia-tao-deploy on my device, and following the provided instructions results in failure.
Here is the link to the instructions I followed:

To reproduce the issue:

  1. Run `sudo docker run -it --rm --net=host --runtime nvidia -e DISPLAY=$DISPLAY -v /tmp/.X11-unix/:/tmp/.X11-unix nvcr.io/nvidia/l4t-tensorrt:r8.6.2-devel`
  2. Inside the container, run `pip install nvidia-tao-deploy`; it fails with:

```
    Collecting nvidia-tao-deploy
    Downloading nvidia_tao_deploy-4.0.0.1-py3-none-any.whl (2.5 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 MB 13.2 MB/s eta 0:00:00
    Collecting onnx
    Downloading onnx-1.16.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (15.8 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 15.8/15.8 MB 67.4 MB/s eta 0:00:00
    Collecting opencv-python
    Downloading opencv_python-4.10.0.84-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (41.7 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 41.7/41.7 MB 44.9 MB/s eta 0:00:00
    Collecting matplotlib>=3.0.3
    Downloading matplotlib-3.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (8.2 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.2/8.2 MB 57.9 MB/s eta 0:00:00
    Collecting protobuf==3.20.1
    Downloading protobuf-3.20.1-cp310-cp310-manylinux2014_aarch64.whl (917 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 917.9/917.9 KB 38.9 MB/s eta 0:00:00
    Collecting hydra-core==1.2.0
    Downloading hydra_core-1.2.0-py3-none-any.whl (151 kB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 151.1/151.1 KB 23.2 MB/s eta 0:00:00
    Collecting scikit-learn==0.24.2
    Downloading scikit-learn-0.24.2.tar.gz (7.5 MB)
    ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.5/7.5 MB 56.9 MB/s eta 0:00:00
    Installing build dependencies: started
    Installing build dependencies: finished with status 'done'
    Getting requirements to build wheel: started
    Getting requirements to build wheel: finished with status 'done'
    Preparing metadata (pyproject.toml): started
    Preparing metadata (pyproject.toml): still running...
    Preparing metadata (pyproject.toml): finished with status 'error'
    error: subprocess-exited-with-error

× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [1596 lines of output]
Partial import of sklearn during the build process.
/usr/lib/python3.10/importlib/__init__.py:126: UserWarning: A NumPy version >=1.23.5 and <2.3.0 is required

  The above exception was the direct cause of the following exception:
  
  Traceback (most recent call last):
    File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
  [ 1/53] Cythonizing sklearn/__check_build/_check_build.pyx
  [ 2/53] Cythonizing sklearn/_isotonic.pyx
  [ 3/53] Cythonizing sklearn/cluster/_dbscan_inner.pyx
  [ 4/53] Cythonizing sklearn/cluster/_hierarchical_fast.pyx
  [ 5/53] Cythonizing sklearn/cluster/_k_means_elkan.pyx
  [ 6/53] Cythonizing sklearn/cluster/_k_means_fast.pyx
  [ 7/53] Cythonizing sklearn/cluster/_k_means_lloyd.pyx
  [ 8/53] Cythonizing sklearn/datasets/_svmlight_format_fast.pyx
  [ 9/53] Cythonizing sklearn/decomposition/_cdnmf_fast.pyx
  [10/53] Cythonizing sklearn/decomposition/_online_lda_fast.pyx
  [11/53] Cythonizing sklearn/ensemble/_gradient_boosting.pyx
  [12/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_binning.pyx
  [13/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_bitset.pyx
  [14/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_gradient_boosting.pyx
  [15/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_loss.pyx
  [16/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_predictor.pyx
  [17/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/common.pyx
  [18/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/histogram.pyx
  [19/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/splitting.pyx
  [20/53] Cythonizing sklearn/ensemble/_hist_gradient_boosting/utils.pyx
  [21/53] Cythonizing sklearn/feature_extraction/_hashing_fast.pyx
  [22/53] Cythonizing sklearn/linear_model/_cd_fast.pyx
  [23/53] Cythonizing sklearn/linear_model/_sag_fast.pyx
  [24/53] Cythonizing sklearn/linear_model/_sgd_fast.pyx
  [25/53] Cythonizing sklearn/manifold/_barnes_hut_tsne.pyx
  [26/53] Cythonizing sklearn/manifold/_utils.pyx
  [27/53] Cythonizing sklearn/metrics/_pairwise_fast.pyx
  [28/53] Cythonizing sklearn/metrics/cluster/_expected_mutual_info_fast.pyx
  [29/53] Cythonizing sklearn/neighbors/_ball_tree.pyx
  [30/53] Cythonizing sklearn/neighbors/_dist_metrics.pyx
  [31/53] Cythonizing sklearn/neighbors/_kd_tree.pyx
  [32/53] Cythonizing sklearn/neighbors/_quad_tree.pyx
  [33/53] Cythonizing sklearn/neighbors/_typedefs.pyx
  [34/53] Cythonizing sklearn/preprocessing/_csr_polynomial_expansion.pyx
  [35/53] Cythonizing sklearn/svm/_liblinear.pyx
  [36/53] Cythonizing sklearn/svm/_libsvm.pyx
  [37/53] Cythonizing sklearn/svm/_libsvm_sparse.pyx
  [38/53] Cythonizing sklearn/svm/_newrand.pyx
  [39/53] Cythonizing sklearn/tree/_criterion.pyx
  [40/53] Cythonizing sklearn/tree/_splitter.pyx
  [41/53] Cythonizing sklearn/tree/_tree.pyx
  [42/53] Cythonizing sklearn/tree/_utils.pyx
  [43/53] Cythonizing sklearn/utils/_cython_blas.pyx
  [44/53] Cythonizing sklearn/utils/_fast_dict.pyx
  [45/53] Cythonizing sklearn/utils/_logistic_sigmoid.pyx
  [46/53] Cythonizing sklearn/utils/_openmp_helpers.pyx
  [47/53] Cythonizing sklearn/utils/_random.pyx
  [48/53] Cythonizing sklearn/utils/_seq_dataset.pyx
  [49/53] Cythonizing sklearn/utils/_weight_vector.pyx
  [50/53] Cythonizing sklearn/utils/arrayfuncs.pyx
  [51/53] Cythonizing sklearn/utils/graph_shortest_path.pyx
  [52/53] Cythonizing sklearn/utils/murmurhash.pyx
  [53/53] Cythonizing sklearn/utils/sparsefuncs_fast.pyx
      main()
    File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
      json_out['return_val'] = hook(**hook_input['kwargs'])
    File "/usr/lib/python3/dist-packages/pip/_vendor/pep517/in_process/_in_process.py", line 164, in prepare_metadata_for_build_wheel
      return hook(metadata_directory, config_settings)
    File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 174, in prepare_metadata_for_build_wheel
      self.run_setup()
    File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 267, in run_setup
      super(_BuildMetaLegacyBackend,
    File "/usr/lib/python3/dist-packages/setuptools/build_meta.py", line 158, in run_setup
      exec(compile(code, __file__, 'exec'), locals())
    File "setup.py", line 301, in <module>
      setup_package()
    File "setup.py", line 297, in setup_package
      setup(**metadata)
    File "/usr/lib/python3/dist-packages/numpy/distutils/core.py", line 135, in setup
      config = configuration()
    File "setup.py", line 188, in configuration
      config.add_subpackage('sklearn')
    File "/usr/lib/python3/dist-packages/numpy/distutils/misc_util.py", line 1014, in add_subpackage
      config_list = self.get_subpackage(subpackage_name, subpackage_path,
    File "/usr/lib/python3/dist-packages/numpy/distutils/misc_util.py", line 980, in get_subpackage
      config = self._get_configuration_from_setup_py(
    File "/usr/lib/python3/dist-packages/numpy/distutils/misc_util.py", line 922, in _get_configuration_from_setup_py
      config = setup_module.configuration(*args)
    File "/tmp/pip-install-q_62ve15/scikit-learn_43d84ea2c64e496c99f2389b041220e6/sklearn/setup.py", line 83, in configuration
      cythonize_extensions(top_path, config)
    File "/tmp/pip-install-q_62ve15/scikit-learn_43d84ea2c64e496c99f2389b041220e6/sklearn/_build_utils/__init__.py", line 70, in cythonize_extensions
      config.ext_modules = cythonize(
    File "/tmp/pip-build-env-sk18gr2r/overlay/local/lib/python3.10/dist-packages/Cython/Build/Dependencies.py", line 1145, in cythonize
      result.get(99999)  # seconds
    File "/usr/lib/python3.10/multiprocessing/pool.py", line 774, in get
      raise self._value
  Cython.Compiler.Errors.CompileError: sklearn/ensemble/_hist_gradient_boosting/splitting.pyx
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
```
I’ve attached the full log.txt file for this issue.
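From the traceback, pip is building scikit-learn 0.24.2 from source (there is no prebuilt cp310 aarch64 wheel for that pin), and the Cython CompileError on splitting.pyx looks like the failure older scikit-learn releases hit when the isolated build environment pulls in Cython 3.x. Below is a minimal sketch of one thing that might be worth trying inside the container, assuming pip passes the PIP_CONSTRAINT environment variable through to the isolated build environment (not verified on this image):

```bash
# Assumption: the splitting.pyx CompileError comes from Cython 3.x being
# installed into the isolated build environment for scikit-learn 0.24.2.
# Pin an older Cython through a constraints file; PIP_CONSTRAINT is an
# environment variable, so the pip subprocess that installs build
# dependencies should see it as well.
cat > /tmp/build-constraints.txt <<'EOF'
Cython<3.0
EOF
export PIP_CONSTRAINT=/tmp/build-constraints.txt
pip install nvidia-tao-deploy
```

If constraints are not honored in the build environment on this pip version, preinstalling Cython<3.0 plus the build requirements and retrying with `pip install --no-build-isolation` would be the fallback to try.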
What are the next steps to make this work? I need tao-deploy to be able to quantize, calibrate, and build an engine file for an existing model.
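For context, the end state is an INT8 TensorRT engine. If tao-deploy cannot be installed in this container, would building the engine with the trtexec binary that ships with it be an acceptable interim path? A rough sketch, assuming trtexec is at its usual /usr/src/tensorrt/bin location and that the classifier is already exported to ONNX with an existing INT8 calibration cache (the paths are placeholders of mine):

```bash
# Placeholders: /workspace/model.onnx and /workspace/calib.cache stand in for
# an existing ONNX export and a previously generated INT8 calibration cache.
# Without --calib, trtexec falls back to dummy calibration, so the cache
# itself still has to come from a real calibrator.
/usr/src/tensorrt/bin/trtexec \
  --onnx=/workspace/model.onnx \
  --int8 \
  --calib=/workspace/calib.cache \
  --saveEngine=/workspace/model_int8.engine
```

That still leaves the calibration step itself, which is the part I was hoping tao-deploy would handle.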

• Hardware (Model: NVIDIA Jetson AGX Orin Developer Kit - JetPack 6.0 [L4T 36.3.0])
• Network Type (Classification)
• TRT Version (nvcr.io/nvidia/l4t-tensorrt:r8.6.2-devel)
