Cross-device UX testing has become seamless and now supports faster, higher-quality releases
What is our primary use case?
My main use case for BrowserStack is campaign testing for a healthcare product whose application must be responsive on both Android and iOS, and which also supports the iPad. We generally test on three devices: the Samsung S23 for Android, the iPhone 12 for iOS, and the latest iPad available on BrowserStack, in both landscape and portrait orientations.
What is most valuable?
BrowserStack helps me handle challenges with device fragmentation and OS updates in my workflow through its installation and uninstallation features. Whenever new updates come from Okta, we just apply them. Additionally, when the application or a device crashes, BrowserStack's multi-device testing lets me paste the URL into different devices and start separate sessions, so everything can be tested simultaneously, which is really useful. The OS updates are managed automatically from Okta, so these things are well handled.
In my day-to-day work, we generally do manual testing with BrowserStack, checking all UI components for responsiveness. The main purpose is to verify the responsiveness of our application, which includes checking the UI, mainly fonts, alignment, and margins, to see how the content looks on different devices and how it fits within various screen sizes.
The best features BrowserStack offers include both manual and automated testing; its live devices reduce our dependence on real device availability. For automation, we use the Bellatrix framework integrated with BrowserStack, which lets tests run automatically on BrowserStack Android devices. I am most impressed with the automation because when I run everything, BrowserStack provides logs and screenshots for each and every step, which is very helpful.
We also use an AWS pipeline for generating reports on the automation side, where AWS pulls reports from BrowserStack and shares them with all stakeholders automatically.
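As a rough sketch of how such a run is typically wired up, the snippet below builds the BrowserStack remote hub URL and an Android capability set of the kind a Selenium or Appium session would start with. This is a generic illustration rather than the reviewer's Bellatrix setup; the credentials, device name, and build name are placeholders, and the option names follow BrowserStack's documented "bstack:options" convention.

```python
# Sketch (assumptions labeled): a hub URL and capability set for a real
# Android device session on BrowserStack, with debug artifacts enabled.
# Username, access key, build, and session names below are placeholders.

def browserstack_hub_url(username: str, access_key: str) -> str:
    """Remote WebDriver endpoint; BrowserStack authenticates via the URL."""
    return f"https://{username}:{access_key}@hub-cloud.browserstack.com/wd/hub"

def android_caps(device: str, os_version: str, build: str, name: str) -> dict:
    """W3C capabilities for a real Android device, with the per-step
    screenshots and network logs the reviewer describes turned on."""
    return {
        "bstack:options": {
            "deviceName": device,
            "osVersion": os_version,
            "realMobile": "true",
            "buildName": build,
            "sessionName": name,
            "debug": "true",        # step-by-step screenshots
            "networkLogs": "true",  # HAR capture for the session
        }
    }

# Example matching the devices mentioned in these reviews:
caps = android_caps("Samsung Galaxy S23", "13.0", "regression-build", "login-flow")
```

In a real run, `caps` would be passed to a Remote WebDriver pointed at the hub URL; here only the stdlib-level construction is shown.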
BrowserStack has positively impacted my organization primarily through time savings because it is very easy to use and replicates physical devices for testing, which is crucial since we usually do not have physical devices. This saves us time, and when we test the application's UI and UX, it helps identify more bugs with different devices and sizes, improving the overall quality of the software.
What needs improvement?
One improvement I would like to see is in iOS automation, which is currently not feasible for us due to configuration issues. BrowserStack serves as a great tool for cross-browser testing, but iPad and iOS automation specifically requires a lot of setup, and the configuration for iOS automation on BrowserStack needs to be simplified.
Regarding other needed improvements, I sometimes notice speed concerns, as BrowserStack can be delayed in its responses, and from a security standpoint some URLs are inaccessible. While BrowserStack has almost everything, integrating a feature for test case execution would be beneficial, as it would reduce our manual test execution time.
For how long have I used the solution?
I have been using BrowserStack for six years, since 2017. I use it continuously for cross-browser testing; the company provides a license, and we run all of our cross-browser testing on it.
What other advice do I have?
My advice for others looking into BrowserStack is to prefer it over maintaining physical devices, as we have two or three devices at most. BrowserStack offers numerous devices with different versions, allowing us to test our UI and perform responsive testing easily, without manual setup of real devices or version updates.
I believe BrowserStack is a great tool that minimizes manual efforts, and I recommend it to everyone as it effectively addresses challenges associated with real devices. I would rate this product an eight out of ten.
Which deployment model are you using for this solution?
If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?
Continuous cross-browser testing has reduced production defects and improved team collaboration
What is our primary use case?
My main use case over the four years I have been using BrowserStack is testing solutions that must support older devices and operating systems, such as the iPhone 8 or iOS 12. This matters for applications used by customers who do not have the most recent devices on the market and are still using Internet Explorer or earlier versions of Edge; BrowserStack provides the devices and browsers for those legacy solutions.
When using BrowserStack for those legacy devices and browsers, the process typically involves both manual and automation testing, allowing us to use Appium as well as Selenium.
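A legacy-coverage matrix of the kind described above can be sketched as a small config expanded into per-session capability sets. The entries below (iPhone 8 on iOS 12, IE 11 on Windows 10) come from the review; the helper function, key split, and build name are illustrative assumptions, following BrowserStack's "bstack:options" capability convention.

```python
# Sketch (assumption): legacy targets expanded into W3C capability dicts.
LEGACY_TARGETS = [
    {"deviceName": "iPhone 8", "osVersion": "12", "realMobile": "true"},
    {"os": "Windows", "osVersion": "10",
     "browserName": "IE", "browserVersion": "11.0"},
]

def to_capabilities(target: dict, build: str) -> dict:
    """Split top-level browser keys from BrowserStack-specific options."""
    browser_keys = {"browserName", "browserVersion"}
    caps = {k: v for k, v in target.items() if k in browser_keys}
    caps["bstack:options"] = {
        **{k: v for k, v in target.items() if k not in browser_keys},
        "buildName": build,  # placeholder build label
    }
    return caps

matrix = [to_capabilities(t, "legacy-regression") for t in LEGACY_TARGETS]
```

Each entry in `matrix` could then start one remote session, so the same suite covers both a legacy mobile device and a legacy desktop browser.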
What is most valuable?
The best features BrowserStack offers for us include App Live, which has really helped us; the quick availability of real-time devices as soon as new ones are launched, such as when iPhone 17 was released; integration with project management tools including Jira and Slack, which is very handy; and access to network logs, something we have made good use of.
From a productivity standpoint, the integration with the wider ecosystem of project management tools has the biggest impact for us, specifically with Jira and Slack, as it helps us log tickets and bugs directly, providing evidence for the tickets we are logging. This was much slower before, especially when dealing with flaky applications or newly live releases that have numerous problems. The integration helps us quickly log bugs using the evidence provided by BrowserStack.
BrowserStack has positively impacted our organization by improving collaboration and showing quality improvements in releases, with the number of defects leaking into production significantly reduced.
We track two metrics: the number of bugs leaking to production per application and the number of customer support issues reported. Compared to past releases, we have seen a reduction of close to 20 to 25% in defects: when we started using BrowserStack, release four had about 150 bugs reported versus 200 in release three. Incidents reported by end customers have also declined.
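The release-over-release figures above reduce to simple arithmetic; a minimal check:

```python
def defect_reduction_pct(previous: int, current: int) -> float:
    """Percentage drop in defects from one release to the next."""
    return (previous - current) / previous * 100

# Release three had ~200 bugs reported, release four ~150:
drop = defect_reduction_pct(200, 150)  # 25.0, consistent with the 20-25% range
```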
What needs improvement?
I think false positives are an area where BrowserStack can improve, as I have often seen things working fine on actual devices, but on BrowserStack devices, issues arise due to network slowness or AWS region connectivity problems that cause lag.
In addition to false positives and network slowness, feature improvements could include monitoring or consolidated dashboards for multiple releases across different domains, letting us see scheduled runs and linking to reports of passed and failed cases.
For how long have I used the solution?
I have been using BrowserStack for a total period of close to four years across two organizations.
What do I think about the stability of the solution?
BrowserStack is mostly stable for our needs, though sometimes there is slowness in the network, especially when working with AWS-based hosting.
What do I think about the scalability of the solution?
Currently, BrowserStack's scalability for our organization meets our needs as we have relatively limited use cases, and so far what we have scaled has worked fine for us.
How are customer service and support?
I have not had to interact with BrowserStack's customer support team, as most issues were addressed locally.
How would you rate customer service and support?
Which solution did I use previously and why did I switch?
Before using BrowserStack, we had used LambdaTest and physical devices as our prior solutions.
What was our ROI?
We have definitely seen a return on investment with BrowserStack, particularly in the value realized per automated test case and the time saved testing apps across multiple clouds, browsers, and operating systems. That translates into money saved, since we previously had many resources engaged in that work.
What's my experience with pricing, setup cost, and licensing?
The setup cost and licensing were handled at the enterprise level, as our bank is a large organization, and these central negotiations were managed by the finance team, so I have limited exposure to that.
Which other solutions did I evaluate?
Before choosing BrowserStack, we mostly evaluated LambdaTest as an option.
What other advice do I have?
We have a local setup of BrowserStack for direct access, and we also access BrowserStack from AWS EC2 machines, so both kinds of interaction are available to us.
My advice for others looking into BrowserStack is to evaluate options, perform an ROI calculation beforehand, and identify the specific use cases BrowserStack excels at; that will yield a much higher ROI for your organization than using it for everything, including manual testing. I would rate my overall experience with BrowserStack an 8 out of 10.
Which deployment model are you using for this solution?
If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?
Effortless Testing with Smooth Workflows and Real-Device Coverage
Parallel execution actually saves my day during regression sprints. Instead of waiting hours, I wrap things up much faster.
Real-device coverage is impressive. Half of the issues we catch now come from devices we’d never be able to afford or maintain in-house.
The session recordings and live logs feel like someone recorded the entire investigation trail for me, which is super helpful when collaborating with devs.
When the dashboard gets busy, the UI feels a bit sluggish. Not a blocker, but noticeable.
Lightning-Fast Parallel Testing with Seamless CI Integration
Artifacts that tell the full story. Each session link bundles video, network/HAR, console, logcat/syslog, and device metadata, so debugging feels like being on the phone itself.
Stable CI integrations. With GitHub Actions and Jenkins triggers, every PR posts a pass/fail matrix to Slack and links back to the exact failing session, instant triage.
Tunnel drops on long tests under corporate proxies.
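The Slack triage flow this reviewer describes (a per-PR pass/fail matrix linking back to the failing BrowserStack session) could be sketched as a small formatting helper run by the CI job. The function name, result fields, PR label, and session URL below are illustrative assumptions, not part of any BrowserStack or GitHub Actions API.

```python
# Sketch (assumption): build the pass/fail summary a CI step might post
# to Slack, with each failure linking back to its BrowserStack session.

def triage_message(pr: str, results: list) -> str:
    """Format one line per failed test, prefixed by an overall tally."""
    failed = [r for r in results if r["status"] == "failed"]
    passed = len(results) - len(failed)
    lines = [f"{pr}: {passed} passed / {len(failed)} failed"]
    for r in failed:
        lines.append(f"  FAIL {r['name']} -> {r['session_url']}")
    return "\n".join(lines)

# Illustrative data only:
msg = triage_message("PR #1234", [
    {"name": "checkout_flow", "status": "passed", "session_url": ""},
    {"name": "login_flow", "status": "failed",
     "session_url": "https://automate.browserstack.com/sessions/abc123"},
])
```

Posting `msg` to a Slack webhook from a GitHub Actions or Jenkins step would then give the "instant triage" effect described above.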