
    Ollama LLM Inference Server Support by Softlunda

    Sold by: Softlunda 
    Deployed on AWS
    This product has charges associated with it for seller support. Pre-configured Amazon EC2 AMI for secure deployment of Ollama large language model (LLM) inference workloads.

    Overview

    This is a repackaged open-source software product; additional charges apply for support.

    This Amazon Machine Image (AMI) provides a pre-configured and secure environment for running large language model (LLM) inference workloads using Ollama on Amazon EC2.

    Built on Ubuntu 24.04 LTS, the AMI includes all required dependencies to help users deploy AI inference services quickly without manual setup. It is designed for developers, startups, and organizations that need a simple and reliable way to run open-source language models for inference use cases.

    After launch, users can pull and manage supported models directly using Ollama based on their application requirements. No AI models, proprietary datasets, or licensed third-party software are included with this AMI. All data processing remains within the customer AWS account.

    This AMI follows AWS security best practices and is suitable for development, testing, and production inference workloads.

    Highlights

    • Pre-configured Ollama inference environment
    • Official Docker and Ollama installations
    • No preloaded AI models

    Details

    Delivery option
    64-bit (x86) Amazon Machine Image (AMI)

    Operating system
    Ubuntu 24.04



    Pricing

    Pricing is based on actual usage, with charges varying according to how much you consume. Subscriptions have no end date and may be canceled at any time. Alternatively, you can pay upfront for a contract, which typically covers your anticipated usage for the contract duration. Any usage beyond the contract will incur additional usage-based costs.
    Additional AWS infrastructure costs may apply. Use the AWS Pricing Calculator to estimate your infrastructure costs.

    Usage costs (8)

    Dimension                   Cost/hour
    c7i.large (recommended)     $0.08
    t3.large                    $0.08
    m7i-flex.large              $0.08
    t3.medium                   $0.08
    t3.small                    $0.08
    t3a.medium                  $0.08
    m7i.large                   $0.08
    c7i-flex.large              $0.08
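    The hourly support charge above can be combined with an EC2 instance rate to sketch a monthly estimate. The snippet below is a rough illustration only: the EC2 rate used is an assumed placeholder, since actual infrastructure pricing varies by instance type and region (use the AWS Pricing Calculator for real figures).

    ```python
    # Rough monthly cost sketch for the $0.08/hour software support charge.
    # The EC2 rate passed in below is an assumption, not a quoted AWS price.

    HOURS_PER_MONTH = 730  # conventional average month length used by AWS

    def monthly_cost(software_rate_per_hour, ec2_rate_per_hour, hours=HOURS_PER_MONTH):
        """Return (software, infrastructure, total) monthly cost in USD."""
        software = software_rate_per_hour * hours
        infra = ec2_rate_per_hour * hours
        return software, infra, software + infra

    # 0.085 is a hypothetical c7i.large on-demand rate for illustration.
    sw, infra, total = monthly_cost(0.08, 0.085)
    print(f"software ${sw:.2f}, infrastructure ${infra:.2f}, total ${total:.2f}")
    ```

    At $0.08/hour the software support charge alone comes to about $58.40 for a full month of continuous operation, before any infrastructure costs.
    
    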

    Vendor refund policy

    No refunds are offered for this product.


    Legal

    Vendor terms and conditions

    Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.


    Delivery details

    64-bit (x86) Amazon Machine Image (AMI)

    Amazon Machine Image (AMI)

    An AMI is a virtual image that provides the information required to launch an instance. Amazon EC2 (Elastic Compute Cloud) instances are virtual servers on which you can run your applications and workloads, offering varying combinations of CPU, memory, storage, and networking resources. You can launch as many instances from as many different AMIs as you need.

    Version release notes

    • Initial release of Ollama LLM Inference Server
    • Built on Ubuntu 24.04 LTS
    • Pre-installed and configured Ollama for LLM inference workloads

    Additional details

    Usage instructions

    Start the instance with 1-Click: launch the EC2 instance directly from AWS Marketplace using the provided AMI.

    Access to the Ollama LLM Server is via SSH only. As part of the EC2 instance provisioning process, an SSH key pair is associated with the instance. The private key is available only at the time of key pair creation and cannot be retrieved later, so ensure it is stored securely.

    To access the instance, you will need the following details:
    • Private key: created during the EC2 instance provisioning process
    • Username: ubuntu
    • SSH port: 22

    Ollama Service: Ollama is pre-installed and configured on this AMI and starts automatically on boot. Once connected, you can begin pulling and running models immediately.
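    Once connected over SSH (or with Ollama's default port 11434 reachable through a security group, which is not recommended without additional access controls), the service can also be driven through its local HTTP API. The sketch below builds a request for Ollama's documented /api/generate endpoint; the model name and the localhost address are illustrative assumptions, and the actual network call is left commented out because it requires a running instance with a pulled model.

    ```python
    import json
    import urllib.request

    # Ollama listens on port 11434 by default; "localhost" assumes you are
    # running this on the instance itself.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_generate_request(model, prompt):
        """Build the JSON payload for Ollama's /api/generate endpoint.

        "stream": False asks for one complete JSON response instead of a
        stream of partial results.
        """
        return {"model": model, "prompt": prompt, "stream": False}

    # The model name here is an example; pull whichever model you need first
    # (e.g. `ollama pull <model>` on the instance).
    payload = build_generate_request("llama3.2", "Why is the sky blue?")
    body = json.dumps(payload).encode("utf-8")

    req = urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )

    # Uncomment on the instance itself once a model has been pulled:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.loads(resp.read())["response"])
    print(f"prepared POST to {OLLAMA_URL} ({len(body)} bytes)")
    ```

    The same endpoint can be exercised from the shell with curl; keeping the API bound to localhost and tunneling over SSH avoids exposing the inference service to the internet.
    
    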

    Support

    Vendor support

    Support Email: support@softlunda.com 

    Support requests are typically responded to during standard business hours. Additional paid support options may be available upon request.

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.

    Customer reviews

    0 ratings, 0 reviews
    No customer reviews yet.