{"id":53914,"date":"2025-09-21T16:55:42","date_gmt":"2025-09-21T06:55:42","guid":{"rendered":"https:\/\/www.cloudproinc.com.au\/?p=53914"},"modified":"2025-09-21T16:55:44","modified_gmt":"2025-09-21T06:55:44","slug":"run-pytorch-in-net-with-torchsharp","status":"publish","type":"post","link":"https:\/\/cloudproinc.azurewebsites.net\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/","title":{"rendered":"Run PyTorch in .NET with TorchSharp"},"content":{"rendered":"\n<p>In this blog post Practical ways to run PyTorch in .NET with TorchSharp and more we will walk through reliable ways to use PyTorch from .NET, when to choose each approach, and how the pieces work under the hood.<\/p>\n\n\n\n<!--more-->\n\n\n\n<p>At a high level, you have three good options: write and run models directly in .NET with TorchSharp; train in Python and deploy in .NET via ONNX Runtime; or keep Python for inference behind a service boundary and call it from .NET. Each route can be production-grade with the right packaging and testing.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-what-s-happening-under-the-hood\">What\u2019s happening under the hood<\/h2>\n\n\n\n<p><a href=\"https:\/\/www.cloudproinc.com.au\/index.php\/category\/pytorch\/\">PyTorch<\/a> is a tensor library and autograd system with a rich operator set, a module system (torch.nn), and multiple backends (CPU, CUDA). The C++ core of PyTorch is called <em>LibTorch<\/em> and exposes high-performance kernels and the JIT runtime.<\/p>\n\n\n\n<p>TorchSharp is a .NET binding over LibTorch. It provides C# and F# APIs for tensors, autograd, and nn modules, and calls into the same native kernels that Python PyTorch uses. That means it\u2019s fast, supports CPU or GPU, and is deployable as a pure .NET application with native dependencies.<\/p>\n\n\n\n<p>ONNX is an open model format. You can export many PyTorch models to ONNX in Python, then load and run them in .NET with Microsoft\u2019s ONNX Runtime. 
This is excellent for inference, especially when you want a minimal runtime without shipping the whole PyTorch stack.<\/p>\n\n\n\n<p>Finally, a service boundary (REST\/gRPC) lets you keep Python in production for inference while .NET owns the app and business logic. This is often the quickest bridge when you have existing Python models or teams.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-three-production-patterns-to-choose-from\">Three production patterns to choose from<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-1-torchsharp-for-end-to-end-net\">1) TorchSharp for end-to-end .NET<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Pros: Single tech stack, full control, no Python in prod, great performance.<\/li>\n\n\n\n<li>Cons: API surface isn\u2019t identical to Python; you\u2019ll port training code to C#.<\/li>\n\n\n\n<li>Best for: Teams committed to .NET who want training and inference in one runtime.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-2-pytorch-to-onnx-to-net\">2) PyTorch to ONNX to .NET<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Pros: Keep training in Python; lightweight, fast inference with ONNX Runtime.<\/li>\n\n\n\n<li>Cons: Some models\/operators don\u2019t export cleanly; no training, inference only.<\/li>\n\n\n\n<li>Best for: Inference at scale, simpler deployment, minimal native dependencies.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-3-python-service-with-a-net-client\">3) Python service with a .NET client<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Pros: Reuse existing Python code\/libs as-is; easy iteration.<\/li>\n\n\n\n<li>Cons: Two runtimes to operate; network hop; latency considerations.<\/li>\n\n\n\n<li>Best for: Fast time-to-value when models frequently change or are Python-heavy.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-getting-started-with-torchsharp-in-net\">Getting started with TorchSharp in .NET<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\" 
id=\"h-install-packages\">Install packages<\/h3>\n\n\n\n<p>Add the TorchSharp NuGet package to your .NET project. For GPU acceleration, add the matching CUDA-enabled native runtime for your OS\/CUDA version as documented by TorchSharp. For CPU-only scenarios, use the CPU runtime (often brought in by default).<\/p>\n\n\n\n<pre class=\"wp-block-code has-white-color has-black-background-color has-text-color has-background has-link-color wp-elements-712858ff894893fc884decbe0c672648\"><code>dotnet add package TorchSharp<\/code><\/pre>\n\n\n\n<p>Note: TorchSharp ships native LibTorch binaries per OS\/arch. Align your package choice with your deployment target, and prefer 64-bit builds.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-a-minimal-torchsharp-example\">A minimal TorchSharp example<\/h3>\n\n\n\n<p>The sample below trains a tiny binary classifier in C#. It shows device selection (CPU\/GPU), model definition, a training loop, and inference.<\/p>\n\n\n\n<pre class=\"wp-block-code has-white-color has-black-background-color has-text-color has-background has-link-color wp-elements-d84107ed921ef3d8ce4034181c9af649\"><code>using System;\nusing System.Linq;\nusing TorchSharp;\nusing static TorchSharp.torch;\nusing static TorchSharp.torch.nn;\n\nclass Program\n{\n    static void Main()\n    {\n        \/\/ Select device\n        var device = cuda.is_available() ? 
CUDA : CPU;\n        Console.WriteLine($\"Using device: {device.type}\");\n\n        \/\/ Synthetic dataset: points in R^2 labeled by a linear boundary\n        var n = 4096;\n        var x = randn(new long&#91;] { n, 2 }, device: device);  \/\/ (n, 2)\n        var wTrue = tensor(new float&#91;] { 1f, 1f }, device: device).unsqueeze(1); \/\/ (2,1)\n        var logitsTrue = matmul(x, wTrue); \/\/ (n,1)\n        var y = logitsTrue.gt(0f).to_type(ScalarType.Float32); \/\/ (n,1), 0\/1\n\n        \/\/ Model: 2 -&gt; 16 -&gt; 1 with ReLU, train with BCEWithLogits\n        var model = Sequential(\n            (\"fc1\", Linear(2, 16)),\n            (\"relu1\", ReLU()),\n            (\"fc2\", Linear(16, 1))\n        ).to(device);\n\n        var lossFn = BCEWithLogitsLoss();\n        var optimizer = optim.Adam(model.parameters(), lr: 0.05);\n\n        var epochs = 50;\n        var batch = 256;\n\n        model.train();\n        for (int epoch = 1; epoch &lt;= epochs; epoch++)\n        {\n            var perm = randperm(n, device: device).to_type(ScalarType.Int64);\n            var totalLoss = 0.0;\n            for (int i = 0; i &lt; n; i += batch)\n            {\n                var idx = perm.slice(0, i, Math.Min(i + batch, n));\n                var xb = x.index_select(0, idx);\n                var yb = y.index_select(0, idx);\n\n                var logits = model.forward(xb);\n                var loss = lossFn.forward(logits, yb);\n\n                optimizer.zero_grad();\n                loss.backward();\n                optimizer.step();\n\n                totalLoss += loss.to_double();\n\n                \/\/ Dispose batch tensors to keep memory stable\n                xb.Dispose(); yb.Dispose(); logits.Dispose(); loss.Dispose();\n            }\n            Console.WriteLine($\"epoch {epoch}\/{epochs}  loss {totalLoss * batch \/ n:F4}\");\n        }\n\n        \/\/ Inference: compute predictions on a few samples\n        model.eval();\n        using var noGrad = 
no_grad();\n        var test = tensor(new float&#91;,] {{ 1f, 1f }, { -1f, -0.5f }, { 0.5f, -2f }}, device: device);\n        var pred = sigmoid(model.forward(test));\n        \/\/ Move to CPU and read the buffer through a typed accessor\n        var probs = pred.cpu().data&lt;float&gt;().ToArray();\n        Console.WriteLine(\"Predicted probabilities: \" + string.Join(\", \", probs.Select(p =&gt; p.ToString(\"F3\"))));\n\n        \/\/ Cleanup\n        test.Dispose(); pred.Dispose(); x.Dispose(); y.Dispose(); wTrue.Dispose(); logitsTrue.Dispose();\n        model.Dispose(); optimizer.Dispose(); lossFn.Dispose();\n    }\n}\n<\/code><\/pre>\n\n\n\n<p>What to notice:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Same concepts as Python PyTorch: tensors, modules, optimizers, autograd.<\/li>\n\n\n\n<li>Device placement mirrors PyTorch. If CUDA is available, GPU is used.<\/li>\n\n\n\n<li>Dispose intermediate tensors in tight loops to keep memory steady.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-train-in-python-run-in-net-with-onnx-runtime\">Train in Python, run in .NET with ONNX Runtime<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-export-to-onnx-in-python\">Export to ONNX in Python<\/h3>\n\n\n\n<pre class=\"wp-block-code has-white-color has-black-background-color has-text-color has-background has-link-color wp-elements-dccb948d520dd92a22f44b811d9b3171\"><code>import torch, torch.nn as nn\n\nclass M(nn.Module):\n    def __init__(self):\n        super().__init__()\n        self.net = nn.Sequential(\n            nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1)\n        )\n    def forward(self, x):\n        return self.net(x)\n\nmodel = M().eval()\ndummy = torch.randn(1, 2)\n\ntorch.onnx.export(\n    model, dummy, \"model.onnx\",\n    input_names=&#91;\"input\"], output_names=&#91;\"logits\"],\n    dynamic_axes={\"input\": {0: \"batch\"}, \"logits\": {0: \"batch\"}},\n    opset_version=17\n)\n<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-run-the-onnx-model-in-net\">Run the ONNX model in 
.NET<\/h3>\n\n\n\n<pre class=\"wp-block-code has-white-color has-black-background-color has-text-color has-background has-link-color wp-elements-55929498b7db98de709a8fa123daf55c\"><code>dotnet add package Microsoft.ML.OnnxRuntime<\/code><\/pre>\n\n\n\n<pre class=\"wp-block-code has-white-color has-black-background-color has-text-color has-background has-link-color wp-elements-d91ba7711f934d637bba73d6bbf591c1\"><code>using System;\nusing System.Collections.Generic;\nusing System.Linq;\nusing Microsoft.ML.OnnxRuntime;\nusing Microsoft.ML.OnnxRuntime.Tensors;\n\nusing var session = new InferenceSession(\"model.onnx\");\n\n\/\/ Create a 3x2 input (batch=3) by wrapping a flat row-major buffer\nvar data = new float&#91;] { 1f,1f, -1f,-0.5f, 0.5f,-2f };\nvar input = new DenseTensor&lt;float&gt;(data, new&#91;] { 3, 2 });\n\nvar inputs = new List&lt;NamedOnnxValue&gt; {\n    NamedOnnxValue.CreateFromTensor(session.InputMetadata.Keys.First(), input)\n};\nusing var results = session.Run(inputs);\n\nvar output = results.First().AsEnumerable&lt;float&gt;().ToArray();\nConsole.WriteLine(\"Logits: \" + string.Join(\", \", output.Select(v =&gt; v.ToString(\"F3\"))));\n<\/code><\/pre>\n\n\n\n<p>Tip: For GPU inference with ONNX Runtime, use the appropriate GPU-enabled package and ensure CUDA\/cuDNN drivers are present on the host image.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-keep-python-call-it-from-net\">Keep Python, call it from .NET<\/h2>\n\n\n\n<p>When you already have stable Python inference code, wrap it behind a small HTTP or gRPC service. 
FastAPI makes this easy.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-minimal-fastapi-wrapper-python\">Minimal FastAPI wrapper (Python)<\/h3>\n\n\n\n<pre class=\"wp-block-code has-white-color has-black-background-color has-text-color has-background has-link-color wp-elements-c041ff87586d758c76256081d7c9d298\"><code>from fastapi import FastAPI\nimport torch\n\napp = FastAPI()\nmodel = torch.jit.load(\"model.pt\").eval()  # or load a PyTorch nn.Module\n\n@app.post(\"\/predict\")\ndef predict(payload: dict):\n    x = torch.tensor(payload&#91;\"data\"]).float()\n    with torch.no_grad():\n        y = model(x).tolist()\n    return {\"pred\": y}\n<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-net-client-call\">.NET client call<\/h3>\n\n\n\n<pre class=\"wp-block-code has-white-color has-black-background-color has-text-color has-background has-link-color wp-elements-6fb0233329bdea941a5bab4cad28ba7d\"><code>using System.Net.Http;\nusing System.Text;\nusing System.Text.Json;\n\nvar http = new HttpClient();\nvar payload = JsonSerializer.Serialize(new { data = new float&#91;]&#91;] {\n    new float&#91;]{1f,1f}, new float&#91;]{-1f,-0.5f}\n}});\nvar resp = await http.PostAsync(\n    \"http:\/\/ml-service\/predict\",\n    new StringContent(payload, Encoding.UTF8, \"application\/json\"));\nresp.EnsureSuccessStatusCode();\nvar body = await resp.Content.ReadAsStringAsync();\nConsole.WriteLine(body);\n<\/code><\/pre>\n\n\n\n<p>Keep requests small, batch where possible, and consider gRPC for low-latency, high-throughput scenarios.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-how-to-choose\">How to choose<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li>If you want a single runtime and full control, pick TorchSharp.<\/li>\n\n\n\n<li>If you want Python for training and a slim, fast inference in .NET, use ONNX Runtime.<\/li>\n\n\n\n<li>If you want to move fast with existing Python code, expose a service.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" 
id=\"h-performance-and-deployment-tips\">Performance and deployment tips<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-torchsharp\">TorchSharp<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Package native LibTorch with your app. Choose the CPU or CUDA runtime matching your OS\/arch.<\/li>\n\n\n\n<li>For containers, start from an image that includes the right CUDA drivers if using GPU.<\/li>\n\n\n\n<li>Use batches and disable grads for inference (no_grad()). Warm up the model before first request.<\/li>\n\n\n\n<li>Dispose temporary tensors in loops to avoid memory growth.<\/li>\n\n\n\n<li>Align TorchSharp version with its documented LibTorch version to avoid ABI mismatches.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-onnx-runtime\">ONNX Runtime<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Validate the ONNX export as part of CI. Mismatched opsets cause runtime errors.<\/li>\n\n\n\n<li>Use dynamic axes in export if your batch sizes vary.<\/li>\n\n\n\n<li>Normalize inputs in .NET the same way as in Python. 
Shape and layout (NCHW vs NHWC) must match.<\/li>\n\n\n\n<li>Consider the Execution Providers you need (CPU vs CUDA vs DirectML) based on your hardware.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-python-service\">Python service<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Set concurrency with uvicorn\/gunicorn workers tuned to your hardware.<\/li>\n\n\n\n<li>Batch requests server-side to maximize GPU utilization.<\/li>\n\n\n\n<li>Version your model artifact and expose a health\/metadata endpoint.<\/li>\n\n\n\n<li>Secure the service (auth, TLS) and rate-limit externally.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-common-pitfalls-to-avoid\">Common pitfalls to avoid<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Data type mismatches: INT64 vs INT32 indices, float32 vs float64 tensors.<\/li>\n\n\n\n<li>Silent shape errors: log and assert shapes at boundaries; add unit tests.<\/li>\n\n\n\n<li>Export gaps: some custom ops or control flow don\u2019t export well to ONNX. Use TorchScript or service boundary if needed.<\/li>\n\n\n\n<li>Driver\/ABI mismatches: keep CUDA and LibTorch versions aligned across build and deploy.<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-wrapping-up\">Wrapping up<\/h2>\n\n\n\n<p>You have solid, well-supported paths to run PyTorch with .NET. TorchSharp gives you native training and inference in a single stack. ONNX Runtime delivers lightweight, fast inference for many exported models. A Python service is a pragmatic bridge when you need full PyTorch flexibility right now.<\/p>\n\n\n\n<p>Pick the pattern that fits your team and deployment constraints, automate validation in CI, and standardize packaging for your target environments. 
With that foundation, bringing ML to your .NET applications becomes straightforward and maintainable.<\/p>\n\n\n\n<ul class=\"wp-block-yoast-seo-related-links yoast-seo-related-links\">\n<li><a href=\"https:\/\/www.cloudproinc.com.au\/index.php\/2024\/07\/29\/recover-deleted-or-lost-exchange-online-emails-to-pst\/\">Recover Deleted or Lost Exchange Online Emails to PST<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/15\/build-lean-reliable-net-docker-images-for-production\/\">Build Lean Reliable .NET Docker Images for Production<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/18\/the-autonomy-of-tensors\/\">The Autonomy of Tensors<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/07\/21\/running-pytorch-in-microsoft-azure-machine-learning\/\">Running PyTorch in Microsoft Azure Machine Learning<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/14\/mastering-docker-environment-variables-with-docker\/\">Mastering Docker environment variables with Docker<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Build and ship PyTorch models in .NET using TorchSharp, ONNX, or a Python service. 
Practical steps, code, and deployment tips for teams on Windows, Linux, and containers.<\/p>\n","protected":false},"author":1,"featured_media":53915,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_yoast_wpseo_focuskw":"Run PyTorch in .NET with TorchSharp","_yoast_wpseo_title":"","_yoast_wpseo_metadesc":"Learn how to run PyTorch in .NET with TorchSharp effectively and choose the best approach for your ML projects.","_yoast_wpseo_opengraph-title":"","_yoast_wpseo_opengraph-description":"","_yoast_wpseo_twitter-title":"","_yoast_wpseo_twitter-description":"","_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[27,13,75],"tags":[],"class_list":["post-53914","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-net","category-blog","category-pytorch"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.3 (Yoast SEO v27.3) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Run PyTorch in .NET with TorchSharp - CPI Consulting<\/title>\n<meta name=\"description\" content=\"Learn how to run PyTorch in .NET with TorchSharp effectively and choose the best approach for your ML projects.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Run PyTorch in .NET with TorchSharp\" \/>\n<meta property=\"og:description\" content=\"Learn how to run PyTorch in .NET with TorchSharp effectively and choose the best approach for your ML projects.\" \/>\n<meta property=\"og:url\" 
content=\"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/\" \/>\n<meta property=\"og:site_name\" content=\"CPI Consulting\" \/>\n<meta property=\"article:published_time\" content=\"2025-09-21T06:55:42+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-09-21T06:55:44+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/cloudproinc.azurewebsites.net\/wp-content\/uploads\/2025\/09\/practical-ways-to-run-pytorch-in-net-with-torchsharp-and-more.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1536\" \/>\n\t<meta property=\"og:image:height\" content=\"1024\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"CPI Staff\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"CPI Staff\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/\"},\"author\":{\"name\":\"CPI Staff\",\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#\\\/schema\\\/person\\\/192eeeb0ce91062126ce3822ae88fe6e\"},\"headline\":\"Run PyTorch in .NET with 
TorchSharp\",\"datePublished\":\"2025-09-21T06:55:42+00:00\",\"dateModified\":\"2025-09-21T06:55:44+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/\"},\"wordCount\":1005,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/#primaryimage\"},\"thumbnailUrl\":\"\\\/wp-content\\\/uploads\\\/2025\\\/09\\\/practical-ways-to-run-pytorch-in-net-with-torchsharp-and-more.png\",\"articleSection\":[\".NET\",\"Blog\",\"PyTorch\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/\",\"url\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/\",\"name\":\"Run PyTorch in .NET with TorchSharp - CPI Consulting\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/#primaryimage\"},\"thumbnailUrl\":\"\\\/wp-content\\\/uploads\\\/2025\\\/09\\\/practical-ways-to-run-pytorch-in-net-with-torchsharp-and-more.png\",\"datePublished\":\"2025-09-21T06:55:42+00:00\",\"dateModified\":\"2025-09-21T06:55:44+00:00\",\"description\":\"Learn how to run PyTorch in .NET with TorchSharp effectively and choose the 
best approach for your ML projects.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/#primaryimage\",\"url\":\"\\\/wp-content\\\/uploads\\\/2025\\\/09\\\/practical-ways-to-run-pytorch-in-net-with-torchsharp-and-more.png\",\"contentUrl\":\"\\\/wp-content\\\/uploads\\\/2025\\\/09\\\/practical-ways-to-run-pytorch-in-net-with-torchsharp-and-more.png\",\"width\":1536,\"height\":1024},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.cloudproinc.com.au\\\/index.php\\\/2025\\\/09\\\/21\\\/run-pytorch-in-net-with-torchsharp\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Run PyTorch in .NET with TorchSharp\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#website\",\"url\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/\",\"name\":\"Cloud Pro Inc - CPI Consulting Pty Ltd\",\"description\":\"Cloud, AI &amp; Cybersecurity Consulting | 
Melbourne\",\"publisher\":{\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#organization\",\"name\":\"Cloud Pro Inc - Cloud Pro Inc - CPI Consulting Pty Ltd\",\"url\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"\\\/wp-content\\\/uploads\\\/2022\\\/01\\\/favfinalfile.png\",\"contentUrl\":\"\\\/wp-content\\\/uploads\\\/2022\\\/01\\\/favfinalfile.png\",\"width\":500,\"height\":500,\"caption\":\"Cloud Pro Inc - Cloud Pro Inc - CPI Consulting Pty Ltd\"},\"image\":{\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#\\\/schema\\\/logo\\\/image\\\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/#\\\/schema\\\/person\\\/192eeeb0ce91062126ce3822ae88fe6e\",\"name\":\"CPI Staff\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/2d96eeb53b791d92c8c50dd667e3beec92c93253bb6ff21c02cfa8ca73665c70?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/2d96eeb53b791d92c8c50dd667e3beec92c93253bb6ff21c02cfa8ca73665c70?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/2d96eeb53b791d92c8c50dd667e3beec92c93253bb6ff21c02cfa8ca73665c70?s=96&d=mm&r=g\",\"caption\":\"CPI 
Staff\"},\"sameAs\":[\"http:\\\/\\\/www.cloudproinc.com.au\"],\"url\":\"https:\\\/\\\/cloudproinc.azurewebsites.net\\\/index.php\\\/author\\\/cpiadmin\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Run PyTorch in .NET with TorchSharp - CPI Consulting","description":"Learn how to run PyTorch in .NET with TorchSharp effectively and choose the best approach for your ML projects.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/","og_locale":"en_US","og_type":"article","og_title":"Run PyTorch in .NET with TorchSharp","og_description":"Learn how to run PyTorch in .NET with TorchSharp effectively and choose the best approach for your ML projects.","og_url":"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/","og_site_name":"CPI Consulting","article_published_time":"2025-09-21T06:55:42+00:00","article_modified_time":"2025-09-21T06:55:44+00:00","og_image":[{"width":1536,"height":1024,"url":"https:\/\/cloudproinc.azurewebsites.net\/wp-content\/uploads\/2025\/09\/practical-ways-to-run-pytorch-in-net-with-torchsharp-and-more.png","type":"image\/png"}],"author":"CPI Staff","twitter_card":"summary_large_image","twitter_misc":{"Written by":"CPI Staff","Est. 
reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/#article","isPartOf":{"@id":"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/"},"author":{"name":"CPI Staff","@id":"https:\/\/cloudproinc.azurewebsites.net\/#\/schema\/person\/192eeeb0ce91062126ce3822ae88fe6e"},"headline":"Run PyTorch in .NET with TorchSharp","datePublished":"2025-09-21T06:55:42+00:00","dateModified":"2025-09-21T06:55:44+00:00","mainEntityOfPage":{"@id":"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/"},"wordCount":1005,"commentCount":0,"publisher":{"@id":"https:\/\/cloudproinc.azurewebsites.net\/#organization"},"image":{"@id":"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/#primaryimage"},"thumbnailUrl":"\/wp-content\/uploads\/2025\/09\/practical-ways-to-run-pytorch-in-net-with-torchsharp-and-more.png","articleSection":[".NET","Blog","PyTorch"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/","url":"https:\/\/www.cloudproinc.com.au\/index.php\/2025\/09\/21\/run-pytorch-in-net-with-torchsharp\/","name":"Run PyTorch in .NET with TorchSharp - CPI 