
    CPU LLM Server with Ollama and Open WebUI by Prezelf

    Sold by: Prezelfy 
    Deployed on AWS
    AWS Free Tier
    This product has charges associated with it for packaging, hardening, preconfiguration, and maintenance of a CPU-based LLM server with Ollama and Open WebUI.

    Overview

    This is a repackaged open-source software product; the additional charges cover packaging, hardening, preconfiguration, and maintenance of the CPU-based LLM server deployment.

    The AMI includes Ollama and Open WebUI configured to run as system services, firewall rules for the required ports, and a curated set of preloaded CPU-friendly models for faster initial use. Additional value comes from deployment-ready defaults, integration setup, and ongoing image maintenance updates, so customers can launch and operate the stack with less setup effort. The packaged deployment exposes the Ollama API on port 11434 and Open WebUI on port 8080, with both services configured to start automatically at boot. The open-source components remain available under their respective licenses.
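
    For example, once an instance is launched, the bundled Ollama API can be exercised directly. The sketch below is illustrative only: <public-ip> is a placeholder for your instance address, "llama3.2" stands in for whichever preloaded model you choose, and the ports must be reachable through your security group (or run the same commands against localhost from the instance itself).

        # List the models available on the instance (Ollama's standard /api/tags endpoint)
        curl http://<public-ip>:11434/api/tags

        # Send a single non-streaming prompt; swap "llama3.2" for a model returned by /api/tags
        curl http://<public-ip>:11434/api/generate \
          -d '{"model": "llama3.2", "prompt": "Say hello in one sentence.", "stream": false}'

        # Open WebUI is served in a browser at:
        #   http://<public-ip>:8080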

    Highlights

    • Security Best Practices: The image is hardened with CIS-aligned baseline controls, a reduced set of running services, and secure-by-default configuration for cloud deployment. This provides a stronger starting point for teams that need an improved security posture from first boot.
    • Regular Maintenance Updates: This AMI is updated regularly to include current package updates, security patches, and tested configuration improvements across the operating system and bundled components. The goal is to reduce exposure to stale dependencies while keeping launches reliable.
    • Preconfigured LLM Stack: Ollama and Open WebUI are preinstalled and configured as system services with the required firewall rules in place. The deployment includes curated CPU-friendly models, so users can start inference and UI-based interaction quickly without manual installation or service wiring (see the port check after this list).
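
    As a quick confirmation of the preconfigured ports, the following can be run on the instance after launch. This is a minimal sketch that assumes only what the listing states: the two services listen on 8080 and 11434.

        # Confirm the services are listening on their documented ports (sudo shows process names)
        sudo ss -tlnp | grep -E ':(8080|11434)'

        # The Ollama API answers a simple version query on its port
        curl -s http://localhost:11434/api/version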

    Details

    Sold by
    Prezelfy

    Delivery method
    Amazon Machine Image (AMI)

    Delivery option
    64-bit (x86) Amazon Machine Image (AMI)

    Latest version

    Operating system
    Amazon Linux 2023

    Deployed on AWS

    Features and programs

    Financing for AWS Marketplace purchases

    AWS Marketplace now accepts line of credit payments through the PNC Vendor Finance program. This program is available to select AWS customers in the US, excluding NV, NC, ND, TN, & VT.

    Pricing

    CPU LLM Server with Ollama and Open WebUI by Prezelf

    Pricing is based on actual usage, with charges varying according to how much you consume. Subscriptions have no end date and may be canceled any time.
    Additional AWS infrastructure costs may apply. Use the AWS Pricing Calculator to estimate your infrastructure costs.
    If you are an AWS Free Tier customer with a free plan, you are eligible to subscribe to this offer. You can use free credits to cover the cost of eligible AWS infrastructure. See AWS Free Tier for more details. If you created an AWS account before July 15th, 2025, and qualify for the Legacy AWS Free Tier, Amazon EC2 charges for Micro instances are free for up to 750 hours per month. See Legacy AWS Free Tier for more details.

    Usage costs (199)

    Dimension                 Cost/hour
    t3.large (Recommended)    $0.04
    t2.micro                  $0.04
    t3.micro                  $0.04
    r8i-flex.large            $0.04
    r8i.large                 $0.04
    r8id.xlarge               $0.04
    r5.large                  $0.04
    r5d.xlarge                $0.04
    r3.xlarge                 $0.04
    r7i.large                 $0.04

    Vendor refund policy

    All sales are final. Due to the nature of digital infrastructure products, we do not offer refunds once the AMI has been launched. Please review the product details and usage instructions carefully before purchase. If you experience technical issues, our support team is available to assist at support@prezelfy.com.


    Legal

    Vendor terms and conditions

    Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.

    Usage information


    Delivery details

    64-bit (x86) Amazon Machine Image (AMI)

    Amazon Machine Image (AMI)

    An AMI is a virtual image that provides the information required to launch an instance. Amazon EC2 (Elastic Compute Cloud) instances are virtual servers on which you can run your applications and workloads, offering varying combinations of CPU, memory, storage, and networking resources. You can launch as many instances from as many different AMIs as you need.

    Version release notes

    This release introduces a preconfigured CPU-based LLM server stack that combines Ollama and Open WebUI in a deployment-ready AMI. The image includes service wiring, startup configuration, and network defaults for faster launch and reduced manual setup.

    What is included in this version:
    • Ollama installed and configured as a system service for local model serving.
    • Open WebUI installed and configured as a system service for browser-based interaction.
    • Curated CPU-friendly model set preloaded during build, with validation to ensure required models are present.
    • First-boot model preload fallback to recover automatically if models are missing at launch time.
    • Persistent and explicit model storage path configuration for consistent runtime behavior.
    • Firewall configuration for required ports, including Open WebUI on 8080 and the Ollama API on 11434.
    • Improved service reliability updates, including corrected Ollama binary path resolution and runtime dependency handling.
    • Operational hardening and baseline cloud-ready defaults for production-oriented deployments.

    User-visible outcomes:
    • Faster time to first prompt with reduced post-launch configuration.
    • Consistent service startup across reboots.
    • Improved reliability of model availability in Open WebUI and the Ollama API.
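
    As a quick way to confirm these outcomes on a running instance, the services and preloaded models can be inspected over SSH. The unit names ollama and open-webui below are assumptions about this image rather than names confirmed by the listing; adjust them if they differ.

        # Check that both services are active (unit names are assumed)
        sudo systemctl status ollama open-webui --no-pager

        # List the models preloaded into the configured model storage path
        ollama list

        # Review recent service logs if a model fails to appear in Open WebUI
        sudo journalctl -u ollama -n 50 --no-pager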

    Additional details

    Usage instructions

    Connect via SSH using your EC2 key pair: ssh ec2-user@<public-ip>. Step-by-step instructions: https://www.prezelfy.com/ai-on-cpu
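
    A typical first connection might look like the sketch below; my-key.pem and <public-ip> are placeholders for your own key pair file and the instance's public address.

        # Connect with the key pair selected at launch (placeholder filename)
        ssh -i my-key.pem ec2-user@<public-ip>

        # Once connected, Open WebUI is reachable from a browser at http://<public-ip>:8080,
        # provided the instance's security group allows inbound TCP 8080.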

    Support

    Vendor support

    For support, inquiries, or specific requests, contact support@prezelfy.com.

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.


    Customer reviews

    Ratings and reviews

    0 ratings, 0 reviews
    No customer reviews yet.