xnnpack

Versions: [2021.02.22, 2022.02.16]

xnnpack

Download        : docker pull ghcr.io/autamus/xnnpack
Compressed Size : 27MB

Description

High-efficiency floating-point neural network inference operators for mobile, server, and Web

Usage

Pull (Download)

To download the latest version of xnnpack, run:

docker pull ghcr.io/autamus/xnnpack:latest

or, to download a specific version of xnnpack, run:

docker pull ghcr.io/autamus/xnnpack:2021.02.22
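
Once the pull finishes, you can confirm the image is available locally by listing the images from this repository:

docker images ghcr.io/autamus/xnnpack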

Run

To run the container as an application, run:

docker run --rm ghcr.io/autamus/xnnpack xnnpack --version

or, to start an interactive session inside the container, run:

docker run -it --rm ghcr.io/autamus/xnnpack bash
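
XNNPACK is primarily a library of inference operators, so an interactive session is mostly useful for inspecting or building against the installed library. As a minimal sketch, assuming the image ships the XNNPACK headers and library files somewhere on the filesystem (exact paths may vary between image versions), you can locate them with:

docker run --rm ghcr.io/autamus/xnnpack bash -c 'find / -name "xnnpack.h" -o -name "libXNNPACK*" 2>/dev/null'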

Mounting volumes between the container and your machine

To access files from your machine inside the xnnpack container, you’ll have to mount them using the -v external/path:internal/path option.

For example,

docker run -v ~/Documents/Data:/Data ghcr.io/autamus/xnnpack xnnpack /Data/myData.csv

which will mount the ~/Documents/Data directory on your computer to the /Data directory within the container.
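
If you also want to collect output separately, you can mount a second volume and mark the input data read-only; the directories and the trailing command below are illustrative placeholders:

docker run --rm -v ~/Documents/Data:/Data:ro -v ~/Documents/Results:/Results ghcr.io/autamus/xnnpack xnnpack /Data/myData.csv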

HPC

If you’re looking to use this container in an HPC environment, we recommend Singularity-HPC (shpc), which lets you load the container just like any other module on the cluster. Check out the SHPC xnnpack container here.
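
As a rough sketch, installing and loading the container with shpc typically looks like the following (the exact module name depends on your shpc registry configuration and the version you install):

shpc install ghcr.io/autamus/xnnpack
module load ghcr.io/autamus/xnnpack/2021.02.22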