
Depth Anything V2 (small)

C++ port (Windows, Linux, macOS; CUDA- and Metal-accelerated) of https://github.com/DepthAnything/Depth-Anything-V2.

Example Input & Outputs

(Image pairs: each input photograph alongside its computed depth map.)

Demo Code

```cpp
#include "blace_ai.h"
#include <opencv2/opencv.hpp>

// include the models you want to use
#include "depth_anything_v2_v8_small_v3_ALL_export_version_v17.h"

using namespace blace;

int main() {
  workload_management::BlaceWorld blace;

  // load image into op
  auto exe_path = util::getPathToExe();
  std::filesystem::path photo_path = exe_path / "butterfly.jpg";
  auto world_tensor_orig =
      CONSTRUCT_OP(ops::FromImageFileOp(photo_path.string()));

  // interpolate to a size consumable by the model
  auto interpolated = CONSTRUCT_OP(ops::Interpolate2DOp(
      world_tensor_orig, 700, 1288, ml_core::BICUBIC, false, true));

  // construct model inference arguments
  ml_core::InferenceArgsCollection infer_args;
  infer_args.inference_args.backends = {
      ml_core::TORCHSCRIPT_CUDA_FP16, ml_core::TORCHSCRIPT_MPS_FP16,
      ml_core::TORCHSCRIPT_CUDA_FP32, ml_core::TORCHSCRIPT_MPS_FP32,
      ml_core::ONNX_DML_FP32,         ml_core::TORCHSCRIPT_CPU_FP32};

  // construct the inference operation
  auto infer_op = depth_anything_v2_v8_small_v3_ALL_export_version_v17_run(
      interpolated, 0, infer_args, util::getPathToExe().string());

  // normalize depth to the [0, 1] range
  auto result_depth = CONSTRUCT_OP(ops::NormalizeToZeroOneOP(infer_op));

  // construct an evaluator and evaluate to a cv::Mat
  computation_graph::GraphEvaluator evaluator(result_depth);
  auto cv_result = evaluator.evaluateToCVMat().value();

  // scale to [0, 255] for plotting
  cv_result *= 255.;

  // save to disk and return
  auto out_file = exe_path / "depth_result.png";
  cv::imwrite(out_file.string(), cv_result);

  return 0;
}
```

Follow the 5-minute instructions to build and run the demo. Tested on version 0.9.62 of the blace.ai SDK; it might also work on newer or older releases (check the blace.ai release notes for breaking changes).

Supported Backends

- Torchscript CPU
- Torchscript CUDA FP16
- Torchscript CUDA FP32
- Torchscript MPS FP16
- Torchscript MPS FP32
- ONNX CPU FP32
- ONNX DirectML FP32

Artifacts

- Torchscript Payload
- ONNX Payload
- Demo Project
- Header

License