
    Ollama with Open WebUI with support by Elm Computing

This product has charges associated with it for seller support. This prebuilt Ubuntu 26.04 AMI provides a private local LLM environment on EC2 with Ollama, Open WebUI, and a small preloaded validation model.

    Overview

    This prebuilt AMI provides a ready-to-run local LLM environment on AWS EC2. It combines Ollama for local model serving and Open WebUI for a browser interface to the Ollama backend.

    Open Source Disclaimer

This is a repackaged open source software product; additional charges apply for support provided by Elm Computing.

    Open source components included in this image remain subject to their respective upstream licenses. Elm Computing provides packaging, configuration, documentation, maintenance, and support for this AMI; it does not claim ownership of upstream open source projects included in the image.

    Disclaimer: All trademarks referenced in this listing belong to their respective owners. Their use does not imply any affiliation with or endorsement by the trademark holders.

    What Is Included

    • Ollama local model server.
    • Open WebUI browser interface.
    • Preloaded smollm2:135m model for immediate validation.

    Typical Use Cases

    • Evaluate local LLM workflows without wiring external model APIs.
    • Run a private Ollama endpoint inside an AWS account.
    • Provide a simple web UI for users testing Ollama-hosted models.

    Notes

    • The included model is intentionally small. Pull larger Ollama models after launch if your instance type has enough memory, CPU, and disk capacity.
    • Ollama uses CPU inference by default in this image.
    • Open WebUI listens on port 8080.
    • Ollama listens locally on port 11434; keep it local and use SSH tunneling for direct API access.
    • For production deployments, configure secrets, access controls, and security groups according to your organization's requirements.

    Highlights

    • Easy-to-launch AI server with web GUI access
    • Elm Computing Support

Details

Delivery method
64-bit (x86) Amazon Machine Image (AMI)

Operating system
Ubuntu 26.04


    Pricing

    Free trial

    Try this product free for 5 days according to the free trial terms set by the vendor. Usage-based pricing is in effect for usage beyond the free trial terms. Your free trial gets automatically converted to a paid subscription when the trial ends, but may be canceled any time before that.

    Pricing is based on actual usage, with charges varying according to how much you consume. Subscriptions have no end date and may be canceled any time.
Additional AWS infrastructure costs may apply. Use the AWS Pricing Calculator to estimate your infrastructure costs.
If you are an AWS Free Tier customer with a free plan, you are eligible to subscribe to this offer. You can use free credits to cover the cost of eligible AWS infrastructure. See AWS Free Tier for more details. If you created an AWS account before July 15th, 2025, and qualify for the Legacy AWS Free Tier, Amazon EC2 charges for Micro instances are free for up to 750 hours per month. See Legacy AWS Free Tier for more details.

Usage costs (383)

Dimension          Cost/hour
t3a.medium         $0.025   (recommended)
t3.micro           $0.0125
r8i.metal-48xl     $0.05
hpc7a.96xlarge     $0.05
c7a.16xlarge       $0.05
c8a.48xlarge       $0.05
m5d.large          $0.05
r6in.12xlarge      $0.05
c5n.large          $0.05
m5zn.large         $0.05

    Vendor refund policy

    Refunds are generally not available. Instances are billed hourly based on actual usage and can be terminated at any time to stop charges.

    How can we make this page better?

    Tell us how we can improve this page, or report an issue with this product.

    Legal

    Vendor terms and conditions

Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.


    Delivery details

    64-bit (x86) Amazon Machine Image (AMI)

    Amazon Machine Image (AMI)

    An AMI is a virtual image that provides the information required to launch an instance. Amazon EC2 (Elastic Compute Cloud) instances are virtual servers on which you can run your applications and workloads, offering varying combinations of CPU, memory, storage, and networking resources. You can launch as many instances from as many different AMIs as you need.

    Version release notes

Ollama and Open WebUI on Ubuntu 26.04

    Additional details

    Usage instructions

    Browser Access

Follow these steps to open the web interface and start chatting with the preloaded model.

    Launch

    1. Launch an EC2 instance from the published AMI.
    2. Choose t3.medium or larger for basic validation with the included smollm2:135m model. Larger models require larger instances and more disk capacity.
    3. In the security group, allow Open WebUI only from trusted IP addresses.
    4. Allow TCP port 8080 from your IP address or trusted network range.
    5. Do not expose the Ollama API port 11434 publicly.
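The security group rules in steps 3-5 can be sketched with the AWS CLI. This is a hedged example: SG_ID and MY_IP are hypothetical placeholders, and the actual call is commented out so you can review it before running it against your account.

```shell
# Placeholders -- replace with your own security group ID and trusted IP range.
SG_ID="sg-0123456789abcdef0"   # hypothetical security group ID
MY_IP="203.0.113.10/32"        # your workstation's public IP, /32 for a single host

# Allow Open WebUI (TCP 8080) only from the trusted range; do NOT add a
# similar rule for 11434 -- the Ollama API should stay private.
# Uncomment to run against your account:
# aws ec2 authorize-security-group-ingress \
#   --group-id "$SG_ID" --protocol tcp --port 8080 --cidr "$MY_IP"
```

Using a /32 CIDR restricts access to exactly one address; widen it only to a network range you trust.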

    Open WebUI

    After the instance is running, copy the EC2 instance public IPv4 address from the AWS console.

    Open this URL in your browser:

    URL: http://INSTANCE_PUBLIC_IP:8080

    Replace INSTANCE_PUBLIC_IP with the EC2 instance public IPv4 address shown in the AWS console.

    On first access, create the initial Open WebUI account. The image includes the small smollm2:135m model so you can validate chat immediately.

    If the page does not load, confirm that the instance is running and that the security group allows TCP port 8080 from your current IP address.
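The reachability check above can be scripted from your workstation. A minimal sketch, assuming only that curl is installed locally; the function name and the example address are placeholders.

```shell
# check_webui: HEAD-request Open WebUI on port 8080 of the given host.
# -s silences progress output, -I sends a HEAD request, -m 5 caps the
# attempt at 5 seconds so a blocked port fails fast.
check_webui() {
  if curl -s -I -m 5 "http://$1:8080" >/dev/null; then
    echo "reachable"
  else
    echo "unreachable"
  fi
}

# Example (replace with the instance public IPv4 address from the console):
# check_webui INSTANCE_PUBLIC_IP
```

"unreachable" usually means the instance is still booting or the security group does not allow TCP 8080 from your current IP.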

    Private Access And Administration

    Use this section for command-line access, private access through SSH tunnels, direct Ollama API access, or additional model management.

    SSH Access

    Use the SSH command shown in the EC2 instance console. In the AWS console, select the instance, choose Connect, open the SSH client tab, and copy the generated command.

    The command will look similar to this:

    Command: ssh -i KEY_PAIR.pem ubuntu@INSTANCE_PUBLIC_IP

    Replace KEY_PAIR.pem with your private key file and INSTANCE_PUBLIC_IP with the EC2 instance public IPv4 address.

    Open WebUI Through SSH Tunnel

    If you do not want to expose TCP port 8080, use an SSH tunnel from your workstation.

    Add this option to the SSH command from the EC2 console:

    Option: -L 8080:127.0.0.1:8080

    The full command will look similar to this:

    Command: ssh -i KEY_PAIR.pem -L 8080:127.0.0.1:8080 ubuntu@INSTANCE_PUBLIC_IP

    Then open this local URL from your workstation:

    URL: http://127.0.0.1:8080 

    If local port 8080 is already in use, choose another local port such as 18080.

    Command: ssh -i KEY_PAIR.pem -L 18080:127.0.0.1:8080 ubuntu@INSTANCE_PUBLIC_IP

    Then open this local URL from your workstation:

    URL: http://127.0.0.1:18080 

    Ollama API Through SSH Tunnel

    Ollama listens locally on the instance at this URL:

    URL: http://127.0.0.1:11434 

    Keep port 11434 private. To access the Ollama API from your workstation, use an SSH tunnel.

    Add this option to the SSH command from the EC2 console:

    Option: -L 11434:127.0.0.1:11434

    The full command will look similar to this:

    Command: ssh -i KEY_PAIR.pem -L 11434:127.0.0.1:11434 ubuntu@INSTANCE_PUBLIC_IP

    Then call Ollama locally from your workstation:

    Command: curl http://127.0.0.1:11434/api/tags 
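Beyond listing tags, the tunnel also lets you run generations from your workstation. A hedged sketch using the standard Ollama /api/generate route and the preloaded model; the curl call is commented out because it only works while the tunnel above is active.

```shell
# Tunneled Ollama endpoint on the workstation side.
OLLAMA_URL="http://127.0.0.1:11434"

# Non-streaming generation request against the preloaded validation model.
PAYLOAD='{"model":"smollm2:135m","prompt":"Reply with OK only.","stream":false}'

# Uncomment once the SSH tunnel (-L 11434:127.0.0.1:11434) is up:
# curl -s "$OLLAMA_URL/api/generate" \
#   -H 'Content-Type: application/json' -d "$PAYLOAD"
```

Setting "stream": false returns one JSON object instead of a stream of partial responses, which is easier to inspect by hand.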

    Validate From The Instance

    From an SSH session on the instance, check the preloaded model.

    Command: ollama list

    From an SSH session on the instance, run a simple prompt.

    Command: ollama run smollm2:135m "Reply with OK only."

    From an SSH session on the instance, check the Ollama API.

    Command: curl http://127.0.0.1:11434/api/tags 

    From an SSH session on the instance, check Open WebUI.

    Command: curl -I http://127.0.0.1:8080 
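The checks above can be combined into one function to run from an SSH session on the instance. A sketch, assuming the layout described in this listing (preloaded smollm2:135m, Ollama on 11434, Open WebUI on 8080); the function is defined but not invoked.

```shell
# validate_instance: run the listing's validation checks in sequence,
# stopping at the first failure with a short diagnostic.
validate_instance() {
  ollama list | grep -q "smollm2:135m" || { echo "preloaded model missing"; return 1; }
  curl -sf http://127.0.0.1:11434/api/tags >/dev/null || { echo "ollama api down"; return 1; }
  curl -sfI http://127.0.0.1:8080 >/dev/null || { echo "open webui down"; return 1; }
  echo "all checks passed"
}

# Run on the instance:
# validate_instance
```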

    Pull Additional Models

    From an SSH session on the instance, use Ollama to pull additional models after launch.

    Command: ollama pull llama3.2:1b

    Command: ollama list

    Choose model sizes that fit the instance memory and disk capacity. The root volume is intended for basic use and validation; increase storage before pulling larger models or maintaining multiple model copies. If model generation is slow, use a larger instance type or pull a smaller model. CPU inference speed depends heavily on instance size and model size.
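Before pulling a larger model, it helps to check capacity first. A hedged sketch for checking free disk space with portable df flags; on the instance you can pair it with free -h for memory.

```shell
# Available space on the root volume, in whole GB.
# df -Pk prints POSIX-format output in KiB; column 4 of the data row is
# available space, converted here from KiB to GB.
free_disk_gb=$(df -Pk / | awk 'NR==2 {print int($4/1024/1024)}')
echo "free disk on /: ${free_disk_gb} GB"

# On the instance, also check available memory before pulling:
#   free -h
# Pull only models whose quantized size fits comfortably in both.
```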

    Support

    Vendor support

Please contact us at support@elmcomputing.io with any questions.

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.

