Automated data integrations have reduced failures and now support faster, reliable deliveries
What is our primary use case?
My main use case for SnapLogic is building integrations between different applications and systems.
I can give you an example of an integration project I have built using SnapLogic. One of the more complex integrations was near-real-time customer data synchronization between Salesforce and an on-premise ERP system. The goal was to keep both systems aligned on customer records, orders, and status updates without any human intervention, making the whole process automatic with SnapLogic. I designed a triggered pipeline using a Salesforce listener Snap that captures record changes. The data flows through validation and transformation Snaps, where I standardize formats, apply a few operations, and ensure full consistency of the data. I also implemented a reusable error-handling sub-pipeline that logs failures to a monitoring database and sends alerts through email or channel notifications. On the ERP side, I exposed SOAP services, configured SOAP Execute Snaps with dynamic requests, and generated the payloads. As volume increased, I addressed performance by batching requests and parallelizing processing. This integration is now fully automated and monitored, requiring no human intervention, and it is one of many integration projects I have worked on with SnapLogic.
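The batching-and-retry pattern described above is generic. As a rough illustration outside SnapLogic (all function names, batch sizes, and retry counts here are invented, not the actual pipeline configuration), it can be sketched in plain Python:

```python
import time

def process_in_batches(records, send_batch, batch_size=200,
                       max_retries=3, delay=1.0):
    """Send records downstream in fixed-size batches, retrying failures.

    `send_batch` stands in for the downstream call (e.g. a SOAP or REST
    request). Batches that still fail after `max_retries` attempts are
    collected for logging and alerting, mirroring the reusable
    error-handling sub-pipeline described above.
    """
    failed = []
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        for attempt in range(1, max_retries + 1):
            try:
                send_batch(batch)
                break  # batch delivered, move to the next one
            except Exception as exc:
                if attempt == max_retries:
                    failed.append({"batch_start": start, "error": str(exc)})
                else:
                    time.sleep(delay * attempt)  # simple linear backoff
    return failed
```

The returned `failed` list plays the role of the failed-payload store: it can be written to a monitoring table and used to drive email or channel alerts.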
I have also handled various integration use cases with SnapLogic. I have built REST API pipelines that securely expose backend services to external applications, using API tasks, API policies, and pipeline parameters. I have built batch ETL pipelines for migrating large datasets between databases such as Snowflake, using bulk Snaps throughout. Additionally, I have worked on event-driven integrations using Ultra pipelines for low latency. I have connected applications end to end, integrating CRM to ERP and ERP to CRM, handling mapping, transformations, validations, and reconciliation reporting. Another use case is file-processing automation, particularly automated ingestion of CSV, XML, and JSON files, where I parsed and validated the file structures before loading them into databases and generating reports and success/error messages. Lastly, for error handling and monitoring frameworks, I logged failures to database log services, created alerts via email or Slack, and stored failed payloads for reprocessing, ensuring data quality across transformation pipelines with standardized formats. These represent some of the many use cases I have worked on.
How has it helped my organization?
SnapLogic positively impacts my organization, mainly in three areas: speed, system reliability, and maintainability. Before adopting SnapLogic, integrations were either custom coded or handled through scripts, leading to fragility and scaling challenges. With SnapLogic's reusable pipelines and pre-built Snaps, development time for new integrations drops significantly, in some cases from weeks to just a few days, as I no longer need to rebuild connection logic from scratch. The built-in error handling improves reliability, with monitoring dashboards and retry mechanisms reducing production failures and providing visibility into pipeline performance, allowing me to detect and resolve issues proactively instead of relying on business user reports. Another significant improvement is maintainability; the visual and modular nature of pipelines simplifies onboarding for new team members, making it easier to learn and standardize parameterization.
Speaking of metrics, I would emphasize a specific example: in one of my integrations, I synchronized customer and order data between Salesforce and my ERP. Initially, builds took about two to three weeks with custom scripts and manual API logic. After transitioning to SnapLogic, I averaged this down to just days for similar integrations, primarily due to the reusable pipelines that streamline efforts. The majority of time savings stemmed from the pre-built Snaps, eliminating the need to write authentication or pagination logic anew. Regarding reliability, before SnapLogic, I experienced approximately eight integration failures per month due to timeout errors, schema mismatches, or unhandled null data. After implementing structured error handling pipelines, retries, and validation layers, this number has dropped to around two to three incidents monthly, mostly attributed to upstream system issues rather than pipeline failures. From a maintenance angle, onboarding new developers who previously needed weeks to confidently modify integrations has been dramatically reduced with SnapLogic's visual pipelines and standardized design approaches, leading to faster delivery, fewer production issues, and less time spent debugging.
What is most valuable?
SnapLogic offers numerous features that stand out. One of the key features is the pipeline designer visualization, where users can drag and drop components based on their use cases, making it user-friendly. SnapLogic's execution transparency and data preview at each step provide vital information about how components work and their utility. Another highlight is Snap Packs, featuring pre-built connectors, which save time. There are hundreds of connectors for APIs, databases, SaaS applications, files, and messaging systems, with built-in authentication handling. Ultra pipelines boast impressive response times in milliseconds and run persistently on execution nodes. Additionally, SnapLogic's modular design promotes reusability, allowing developers to maintain structured development. The error-handling frameworks also enable production-grade setups without custom frameworks, facilitating retry logic and hybrid architecture flexibility. SnapLogic excels in data transformations, monitoring, and observability, providing scalability controls for the pipelines.
I would like to highlight the expression language feature, which is JavaScript-based and allows logic to be embedded directly in pipeline components. Its strength lies in dynamic routing logic, making it easy to write clean, efficient expressions for various use cases. Another notable feature is the pipeline execution modes, offering triggered tasks, scheduled tasks, Ultra tasks, and different execution patterns. This flexibility aids in designing execution strategies. SnapLogic also integrates metadata and schema handling, with its automatic schema capabilities being a significant differentiator; experienced engineers will appreciate the schema stability SnapLogic provides. Other features such as pipeline patterns, design, and scalability further contribute to its robustness. However, if I had to choose a favorite feature, it would be reusable child pipelines with parameterization, because they enforce standard logic, reduce duplication, and underscore SnapLogic's role as an integration platform rather than just a tool, allowing for team scaling and ensuring consistency.
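As a loose illustration of the kind of dynamic routing logic such expressions carry (written in plain Python rather than SnapLogic's JavaScript-based expression syntax; the field names and branch labels are invented):

```python
def route(document):
    """Pick an output branch for a document, the way a Router Snap driven
    by expression conditions would. Field names and branch labels here
    are hypothetical, not taken from a real pipeline."""
    if document.get("amount", 0) > 10_000:
        return "high_value"   # large orders get special handling
    if document.get("region") == "EMEA":
        return "emea_branch"  # region-specific downstream system
    return "default"
```

In a real pipeline each condition would live in a Router Snap's expression property, with one downstream path per branch.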
What needs improvement?
While SnapLogic is powerful, there are several areas for improvement that could enhance user experience. Version control remains an area needing attention as it currently lacks effective features. Debugging complex pipelines can be painful, especially when dealing with deeply nested structures, making it difficult to trace data lineage across pipelines. Improvements in centralized execution and trace visualizations are also necessary. Furthermore, compared to other code-based tools, there is room for advancement in structured transformations, making this a critical area for improvement.
For how long have I used the solution?
I have been using SnapLogic for eight years.
How are customer service and support?
I would rate customer service as a 5.
How would you rate customer service and support?
What other advice do I have?
For monitoring and alerting my SnapLogic integrations, I utilize various dashboards. I implement a layered approach, conducting platform-level monitoring, pipeline-level logging, and proactive measures. Using the built-in dashboard for runtime metrics and execution histories provides operational visibility. I design pipelines with a centralized logging and alert framework, ensuring failures are immediately detected rather than discovered by users.
I manage versioning and deployment of pipelines using a structured promotion model across environments, including development, QA, and production. Pipelines are developed and tested in development projects and promoted to higher environments using SnapLogic's project export functionality. Environment-specific values remain externalized through parameters and accounts, enabling the same pipeline to operate across all environments without modification. For version control, I maintain backups and track versions with naming conventions for proper documentation and repository snapshots. Prior to deployment, I validate dependencies and conduct test executions to ensure stability, minimizing configuration drift and securing successful deployments.
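The parameter externalization described above can be sketched as a simple overlay pattern; the hosts and values below are placeholders, not real configuration:

```python
def resolve_parameters(defaults, environment):
    """Overlay environment-specific values on pipeline defaults so the
    same pipeline definition runs unchanged in dev, QA, and production.
    Hostnames and batch sizes are invented placeholders."""
    env_overrides = {
        "dev":  {"db_host": "db.dev.internal",  "batch_size": 100},
        "qa":   {"db_host": "db.qa.internal",   "batch_size": 500},
        "prod": {"db_host": "db.prod.internal", "batch_size": 2000},
    }
    # environment values win over defaults, everything else passes through
    return {**defaults, **env_overrides[environment]}
```

This is the same idea as SnapLogic pipeline parameters and account references: the pipeline logic is promoted as-is, and only the externalized values differ per environment.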
SnapLogic supports data transformations primarily through its mapper Snap and expression languages that facilitate complex field mapping. This includes conditional logic, data restructuring, and format conversions. In my projects, I utilize the mapper Snap for most transformations, as it allows for visual mapping of schemas while concurrently supporting advanced logic through expressions. For complex scenarios, I combine the mapper with scripting and aggregate, router, and join Snaps to develop more modular transformation pipelines. This approach maintains transformation processes that are reusable and scalable across integrations.
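A Mapper-style transformation of this kind, combining conditional logic, restructuring, and a format conversion, might look like the following Python sketch (the schema and field names are invented for illustration, not a real Salesforce or ERP schema):

```python
from datetime import datetime

def map_order(src):
    """Map a flat source record to a nested target schema, showing the
    three transformation types mentioned above. All field names are
    hypothetical."""
    return {
        "orderId": src["order_id"],
        # conditional logic
        "status": "OPEN" if src.get("state") in ("new", "pending") else "CLOSED",
        # restructuring: flat fields -> nested object
        "customer": {"name": src["customer_name"].title()},
        # format conversion: ISO date -> day/month/year
        "orderDate": datetime.strptime(src["order_date"], "%Y-%m-%d")
                             .strftime("%d/%m/%Y"),
    }
```

In a Mapper Snap, each of these lines would be one mapping row, with the conditional and date logic written as expressions.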
My advice for anyone considering SnapLogic is to view it as an integration platform rather than merely a tool. Doing so can yield stronger results when teams design pipelines with scalability, modularity, and governance in mind from the onset. Organizations should invest early in defining naming standards, reusable components, parameterization strategies, and monitoring frameworks. SnapLogic can significantly accelerate development, with its real value revealing itself when implemented with architectural discipline rather than for quick, one-off integrations.
As a closing thought about SnapLogic, I would emphasize that it is indeed a powerful integration platform with clear strengths, but it also has defined limits. Its effectiveness comes to the forefront when used properly, and its success heavily relies on implementation discipline rather than solely the tool itself. I have given this review a rating of 9.
Which deployment model are you using for this solution?
Hybrid Cloud
If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?
Fast, Low-Code App Integration with Clear Monitoring and Governance
What do you like best about the product?
I can drag and drop to connect apps and data without needing to code much, which makes getting things done quicker. It works for both real-time and batch jobs, so I don’t have to switch tools for different tasks. The built-in Snaps cover most of the systems I use, so setup is usually straightforward. Monitoring and permissions are clear, so it’s easy to see what’s running and keep things organized. Downsides for me: really complex logic can get tricky, and pricing can add up as projects grow.
What do you dislike about the product?
Complex transformations can get messy in the visual canvas, and sometimes I end up writing scripts anyway. Debugging isn’t always intuitive—tracking down why a pipeline failed can take longer than I expect. Costs can creep up as you add more projects, environments, and connectors. Some connectors feel opinionated or lag behind new SaaS features, so I occasionally hit limitations. Governance helps, but without strict standards pipelines can sprawl and become hard to maintain.
What problems is the product solving and how is that benefiting you?
It connects my SaaS apps, data warehouse, and on‑prem systems in one place, so data moves reliably without manual exports. Real-time and batch pipelines keep reports and dashboards up to date, which speeds up decisions. Reusable templates and Snaps cut development time, so I deliver integrations in days instead of weeks. Centralized monitoring and alerts reduce break/fix time and help me meet SLAs. Governance and access controls let multiple teams collaborate without stepping on each other’s work, improving overall quality.
Snap the Logic
What do you like best about the product?
It helps interconnect data source platforms with each other and provides access to different platforms through mechanisms such as APIs and MCP. It has helped us a lot in the delivery of integrated data.
What do you dislike about the product?
The AI integration doesn't work well with bind variables in each Snap; it automatically turns them on and off, which is not something I like, and I always need to redo the work again and again.
What problems is the product solving and how is that benefiting you?
It helps analyze the data after it has passed through clean API integration sources, and it makes sure the data is consistent enough for business production.
Reliable, Scalable Integrations with No/Low Code
What do you like best about the product?
What I like most is how quickly you can build and scale integrations without writing a lot of custom code. The pipelines are intuitive, especially for common use cases like SaaS integrations, API-based data movement, and loading data into cloud warehouses like Snowflake.
It reduces development time and makes it much easier to maintain integrations.
What do you dislike about the product?
While SnapLogic is powerful, debugging complex pipelines can sometimes be challenging, especially when working with nested Snaps or large workflows. The UI can feel a bit cluttered for very large integrations.
What problems is the product solving and how is that benefiting you?
It solves the problem of integrating data across multiple systems quickly and reliably. It helps us streamline data ingestion from multiple sources into our data warehouse and reduces manual scripting and maintenance effort.
As a result, our data pipelines became more consistent, easier to monitor, and faster to adapt when new sources were introduced.
Daily sales pipelines have delivered accurate reports and reduced manual monitoring
What is our primary use case?
My main use case for SnapLogic is to use the monitor tab to monitor the daily pipelines, the sales reports, and to check on the failures which are happening. When some failures occur, I utilize SnapLogic Designer to make the changes so that the pipeline runs successfully.
A specific example of a pipeline I've worked on recently is the daily sales report, which we deliver to the business during early business hours. It is a critical report, and for that file we faced an issue with duplicates. Together with my senior teammates and colleagues, we created a pipeline that removes all the duplicate entries. It was fascinating because SnapLogic has inbuilt Snaps that we can directly drag and drop to develop a pipeline with very minimal code. Moreover, we can provide different kinds of parameters that are versatile and can be changed according to our requirements. This is one of the pipelines I've worked on recently, about one or two weeks ago, and I monitor it daily after deploying it into the prod environment.
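The duplicate-removal step can be illustrated outside SnapLogic as a first-occurrence-wins filter over a business key (the key fields are invented for illustration):

```python
def remove_duplicates(rows, key_fields):
    """Keep the first occurrence of each business key and drop the rest,
    which is what the dedup pipeline described above does.
    `key_fields` is a hypothetical composite key, e.g. ("order_id", "line_no")."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique
```

Making the key fields a parameter mirrors how pipeline parameters let the same deduplication logic be reused across different reports.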
Most of the pipelines I work on are related to loading sales data. In my current project, data from SQL Server and Redshift is loaded into BigQuery and Oracle tables. In SnapLogic, we developed transformations such as removing duplicates and adding other parameters to remove inconsistencies. These are the types of pipelines we use in SnapLogic to move data, functioning as ETL: extracting from SQL Server or Redshift, transforming in SnapLogic, and loading into GCP and Oracle.
What is most valuable?
One of the best features SnapLogic offers is its user-friendly interface. I was not trained in SnapLogic; I was trained in Informatica, which is also a data engineering tool. Even so, when I first started using SnapLogic, I found that anyone with minimal knowledge of data engineering can easily understand how it works, because it is clearly segregated into a Monitor tab, a Designer tab, and a Manager tab. The role assigned in your project defines which tab you will use the most. Beyond the user interface, the inbuilt Snaps that we can directly drag and drop to create different kinds of pipelines also stand out.
The drag-and-drop functionality and the tabs make my daily work easier compared to my experience with Informatica. During my internship with Informatica, creating mappings took a lot of time and effort since we needed to build Mapplets; while you can create one mapping and reuse it many times, there is little help from the application side, and some transformations most users need are not built in. In contrast, SnapLogic provides inbuilt Snaps for creating and closing an audit ID, removing duplicates, joining tables, and writing to Oracle, files, XML, SF, and SMTP connections, among others. These Snaps can be used directly rather than being developed in every pipeline.
Since moving to SnapLogic, we have observed fewer errors compared to other tools, such as connection failures or network errors. We have also developed pipelines in SnapLogic capable of notifying us about long-running processes. For instance, if a pipeline that typically takes one hour runs for more than two hours, it automatically triggers an email alerting us to focus on that specific pipeline. This feature significantly reduces the need for continuous human monitoring, demonstrating how SnapLogic truly assists us.
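The long-running-pipeline alert described above amounts to a threshold check over the run's start time; as a rough sketch (pipeline name and threshold are invented):

```python
from datetime import datetime, timedelta

def runtime_alert(pipeline_name, started_at, now, threshold_hours=2.0):
    """Return an alert string when a run exceeds its expected duration,
    mirroring the automatic email alert described above. Returns None
    while the run is still within its threshold."""
    elapsed_hours = (now - started_at).total_seconds() / 3600
    if elapsed_hours > threshold_hours:
        return (f"ALERT: {pipeline_name} has been running for "
                f"{elapsed_hours:.1f}h (threshold {threshold_hours}h)")
    return None
```

A scheduled monitoring pipeline can run a check like this against execution metadata and email the non-None results, removing the need for continuous human monitoring.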
What needs improvement?
I have only been using SnapLogic for under a year, around seven to eight months, so I am unsure of the required changes. However, since most tools are now integrating AI to assist new users, incorporating an AI assistant could be beneficial.
Sometimes there are glitches when moving a design pipeline from the development to production environment.
For how long have I used the solution?
I have only been using SnapLogic for about seven months.
What do I think about the stability of the solution?
SnapLogic is very stable.
What do I think about the scalability of the solution?
SnapLogic is very scalable, and it can be adjusted based on our requirements, considering the organization type and the data it produces.
How are customer service and support?
I have not used customer support directly, but I have utilized the integrated helping websites, which have been very helpful when I first started using SnapLogic.
How would you rate customer service and support?
Which solution did I use previously and why did I switch?
I learned from my colleagues that two years ago, all our transformations were carried out in Informatica. Because of the inadequate infrastructure in Informatica, our team suggested transitioning to SnapLogic, which the client accepted. After comparing workflows in Informatica with pipelines in SnapLogic, we found that errors due to network or connection issues are much rarer in SnapLogic than in other tools. The decision to transition to SnapLogic was very beneficial, as it helped the client build a secure and robust environment for their data.
We transitioned to SnapLogic due to the lack of proper infrastructure in the previous tool.
Which other solutions did I evaluate?
My seniors evaluated different options, including SnapLogic, improving the infrastructure in Informatica PowerCenter, and using PySpark, but they ultimately chose SnapLogic over the other options.
What other advice do I have?
My advice for others looking into using SnapLogic is that it is really easy to use. Anyone with minimal knowledge in data engineering or ETL can easily transition into a SnapLogic developer or start using SnapLogic. I have mostly covered all the topics concerning SnapLogic that I appreciate, and I have provided some suggestions that I hope would facilitate ease of use when integrated. I gave SnapLogic a rating of eight out of ten.
Migration projects have accelerated data processing and now require better latency and support
What is our primary use case?
My main use case for SnapLogic is migrations from Boomi and other SQL databases to SnapLogic. Regarding the SQL database, I'm not going to disclose the customer name, but I can give you some highlights. It was financial data for a fintech organization. They wanted to move their P&L and GL data, along with benefits data they have to send to a third-party system, from another iPaaS system to SnapLogic because of delays and latency, which was the biggest issue for them. So we built up plenty of data lakes and migrated the data over to SnapLogic.
Similarly, for the SQL database, it wasn't only the latency; people were also unnecessarily writing a lot of triggers. The customer didn't think that was the correct way, because there was no alignment or streamlining in terms of process design or reusable technical resources. They wanted to identify and streamline the entire process, as well as build some kind of reusable process. I have also found that SnapLogic, and perhaps Boomi or MuleSoft, whichever the customer chooses, has an edge in terms of processing data quickly and with reduced latency. I don't know what changes they are going to make with the help of agentic AI, the pipelines, Databricks, and so on, and how they are going to put the data in perspective within the application. That is the use case I can describe.
What is most valuable?
The best features SnapLogic offers include the different integration patterns you can quickly adapt. Manipulation is easy with the data. When I'm saying easy, I mean that there are a lot of ways to get different data sources into one place, and then you can do a lot of customization. It is easy as well. For a person who doesn't have that kind of exposure, they can quickly adapt. There is no space to say, 'I don't know this system.' If your basics are clear, then you can quickly adapt. So, it is a quick adoption from SnapLogic's side. The second thing is that the kind of features they are providing for the different integration patterns is really unique.
You can build a lot of quick JMS connections. There are inbuilt connections for the likes of SuccessFactors, Salesforce, and Oracle where you can get the data from the ERPs. Then you can do the data manipulation and map the data to the destination system, and send the data to SFTP, to an S3 bucket, or via the API. You can build the API as well within that suite, and then manipulate the data over there. Accordingly, you can utilize the RBAC system in a better manner.
SnapLogic comes with a lower latency rate when it comes to a huge number of pipelines and the terabytes of data they need to read and write. Especially if customers look toward Apache Iceberg, Databricks, or anything Spark-based, in terms of how those systems handle the data being sent, there is a quick turnaround in processing the data for any downstream system.
I believe that SnapLogic impacts the organizations I work with as a contractor. It is really useful for them, especially for new use cases that are quickly adapted with a quick turnaround. It is very useful in terms of testing and realization. Especially when there are unique features in SnapLogic, the entire chain and transport management system for sending the whole config from development to quality and from quality to production is really fantastic. This gives a different kind of aspect to the customer as well, to quickly make changes in development with a quick turnaround. In no time, they can send the data from dev to production.
What needs improvement?
The latency is the biggest issue across iPaaS. That is the important part. I have worked not only with SnapLogic, but also with MuleSoft, Dell Boomi, and Jitterbit. They are all fine, but they have different kinds of functionality. They all have similar kinds of problems in different domains. But as I mentioned, SnapLogic has a little bit of an edge because of its functionality and inbuilt functions for Fintech, as compared to the others. This I can say firmly.
They need to assess themselves. In this day and age, as I mentioned earlier, terabytes of data need to be read with a quick turnaround for downstream systems, especially for GenAI or any LLMs. There is room to improve here, because right now most GenAI systems, such as ChatGPT and Anthropic's Claude, require data quality with perfection and precision. For all of that, we require a data pipeline that can be read without latency and without delay, for any reason. If they can improve that over the cloud, it would be a really good achievement for them, and for the customer as well. Then, no matter what, people cannot leave SnapLogic. They need to be there with the Snaps.
I don't know much about that. I haven't referred to the documentation that much. But support is something that is pretty obviously required, rather than just providing videos. Technical support is required. The roadmap also needs to be very clearly mentioned and specified. Be specific in which domain they are going to do what, if they are coming out with that roadmap. Otherwise, overall, if they are going to improve their entire system as I mentioned earlier, for the reusability concept and the data pipeline concept, then they will definitely do some magic in the future.
From the HR point of view, or for HR tech, improvement is required. A couple of connectors are not working with all the relevant APIs, and there is always a restriction in terms of fetching the data. So that is why I chose six. From the Fintech point of view, if you are asking on a scale of one to 10, then I would give it an eight out of 10. It is a huge one. There is always a margin for improvement, so that is why I chose eight. If you talk about HR, sales, or any other domain, there is a significant amount of improvement required.
For how long have I used the solution?
I have been working in my current field for close to 18 years.
What do I think about the scalability of the solution?
SnapLogic's scalability is huge. There is a huge amount of scalability. That is why I put it on the scale of six. There is an area for improvement. Once that area of improvement is already done, or about to be completed, then it will definitely be a nine out of 10.
How are customer service and support?
Customer support for SnapLogic is neither bad nor good. It is okay, normal. Sometimes it is very good, sometimes there is no response.
How would you rate customer service and support?
Which solution did I use previously and why did I switch?
I did a lot with Boomi. I love Boomi a lot, along with SnapLogic. Where we switched to SnapLogic, it wasn't from Boomi. The customer chose SnapLogic over the traditional way of working, such as a SQL database, an Oracle database, or another iPaaS solution that was not Boomi. They switched because the traditional solutions did not fit their commitment to building an ecosystem on the cloud. Everywhere I have worked where SnapLogic is used, it has been in fintech. Fintech has huge, secured, and very cumbersome numeric logic. These organizations don't have a huge population, but they require very complex calculations which they want to complete easily, and which keep them connected to the platform over many years. So if they want to switch to or from SnapLogic, they need the entire ecosystem as it is: process optimization, approach, documentation, and so on. That is why they chose SnapLogic.
I have evaluated other options. Honestly speaking, I gave the suggestion for both Boomi and MuleSoft.
What about the implementation team?
I am a contractor, so that is the capacity I work in. I did not work with a partner, reseller, or any similar relationship.
What was our ROI?
Time was definitely saved with SnapLogic. Money, I don't know. I'm not part of the pricing and setup, so I can't say. I can't comment on that. It is not that fewer employees were needed, but time was definitely saved, and our process was optimized with the help of SnapLogic. That I can say for sure.
Which other solutions did I evaluate?
Before using SnapLogic, I had a different opinion.
What other advice do I have?
I would rate SnapLogic a six on a scale of one to ten.
I don't remember exact metrics, but I can give you one example where we built around eighty-odd interfaces out of the three hundred that needed to be migrated from Oracle. It worked really well. It was fantastic in terms of processing, latency, data manipulation, and the downstream systems. One interface had to be sent to more than thirty destinations in different time zones with different scheduling. We heavily customized those interfaces, and they really work well.
I would tell others looking into SnapLogic to use it. SnapLogic is not going to give you a bad experience. Any solution with that amount of scalability is unlikely to fall short of customer expectations.
Good luck. Do great things and have great achievements. I would love to see and then move myself also onto SnapLogic's new roadmaps. You will definitely do some wonders with SnapLogic.
Which deployment model are you using for this solution?
Hybrid Cloud
If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?
User-Friendly, Efficient Yet Needs Improved Connectivity
What do you like best about the product?
I basically like the ease of use and the very user-friendly UI that helps any user to start integrating and building pipelines and integration very easily from scratch. The connectivity is very fast, and it has a lot of snaps available to connect with multiple source systems and process the data. The user-friendly UI, especially the drag-and-drop feature, is very helpful for creating complex pipelines. The initial setup was very easy, and SnapLogic Intelligent Integration Platform documentation helps a lot whether it's on cloud or on-premises.
What do you dislike about the product?
Basically, when we are processing a larger dataset, it tends to take a lot of time if the nodes are not configured properly; the load might all land on one particular node, leading to long execution times. Also, while a pipeline is running for a long time, there is a chance of losing connectivity between the external systems and SnapLogic.
What problems is the product solving and how is that benefiting you?
SnapLogic Intelligent Integration Platform (IIP) eases integration with multiple systems, enhances connectivity across environments, and manages large data sets with ease.
Easy Drag-and-Drop Integrations with SnapLogic IIP—Fast, Scalable, and Reliable
What do you like best about the product?
SnapLogic IIP stands out for its ease of use and ease of integration, making it simple to connect multiple applications and data sources. The ease of implementation is a big advantage, as integrations can be built quickly using prebuilt Snaps with minimal coding. It offers a good number of features that support both simple and complex integration needs. We use it frequently because it helps speed up development and reduces manual effort. Overall, it is a reliable platform for enterprise integrations.
What do you dislike about the product?
Although SnapLogic is easy to use once you are familiar with it, the initial learning curve can slow implementation for new users. Debugging and error handling could be improved to enhance overall usability. Customer support response times can be slow for complex issues. Additionally, licensing costs feel high relative to our frequency of use, and better optimization options would help when working with large data volumes.
What problems is the product solving and how is that benefiting you?
SnapLogic IIP helps us solve challenges in connecting multiple systems and data sources across the organization. We use it as an iPaaS solution for ETL, API management, and on-premises data integration, which reduces manual effort and integration complexity. It also supports big data integration and AI agent building, enabling faster data movement and automation. This improves data consistency, speeds up integration delivery, and lets teams focus on business logic rather than technical integration issues.
Powerful Yet Simple Integration—AI Recommendations, Visual Design, and Pre-Built Snaps
What do you like best about the product?
What I like best about SnapLogic IIP is its balance of power and simplicity. The Iris AI provides intelligent recommendations that speed up development, while the visual designer makes complex orchestrations transparent. Its elastic architecture and massive library of pre-built Snaps allow our team to integrate any application or data source in a fraction of the time compared to traditional ETL tools.
What do you dislike about the product?
The primary areas for improvement are the high cost of entry and the complexity of the DevOps/CI/CD lifecycle. While the UI is great for building, the debugging tools for complex transformations could be more granular, and the browser-based Designer can experience performance lag when handling very large pipelines. Additionally, a more standardized expression language or better documentation for syntax quirks would reduce development friction.
What problems is the product solving and how is that benefiting you?
The Problem: Most modern enterprises struggle with data trapped in disparate SaaS applications (Salesforce, Workday, ServiceNow) and legacy on-premise databases that cannot talk to each other.
How it benefits me: It acts as a universal translator. By using pre-built "Snaps," we can connect these systems in hours rather than weeks, ensuring that our customer data is consistent across every department without manual data entry.
Beginner-Friendly with Powerful Integrations
What do you like best about the product?
I use SnapLogic Intelligent Integration Platform (IIP) every day to create pipelines for various purposes. What I like most is its user-friendly drag-and-drop approach, which keeps the platform clutter-free by splitting management and debugging into dedicated sections such as Manager and Monitor. The platform's ease of use is also enhanced by its ability to integrate with third-party tools like Git, Tidal, and AWS, which makes it more powerful. Additionally, SnapLogic's low-code/no-code nature removes the dependency on coding knowledge, making it beginner-friendly. I also appreciate the advanced features that SnapLogic offers, including AI, which allows us to create our own AI agents. The initial setup was quite easy, and we received good support from the SnapLogic support team.
What do you dislike about the product?
In my opinion, SnapLogic is beginner-friendly, but implementing advanced mappings in the Mapper or writing complex expressions requires at least some JavaScript knowledge. When generating an AI agent and implementing it through SnapLogic, there is a dependency on external systems such as Streamlit, Python, and MongoDB; if that could be done within SnapLogic itself, it would reduce the dependency. To further improve the user experience, SnapLogic should also improve its certification content, as many features are still missing from it, and as a result many users are unaware of them.
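To illustrate why JavaScript knowledge matters here: SnapLogic's expression language used in Mapper Snaps is JavaScript-like, so the kind of transformation logic a Mapper expression performs can be sketched in plain JavaScript. This is a hedged illustration, not SnapLogic's own API; the field names (firstName, email, country) are hypothetical.

```javascript
// Sketch of the kind of record standardization a Mapper Snap
// expression might perform, written as plain JavaScript.
// All field names here are hypothetical examples.
function standardizeCustomer(record) {
  return {
    // Concatenate and trim name fields, as an expression like
    // ($firstName + " " + $lastName).trim() would in SnapLogic
    fullName: (record.firstName + " " + record.lastName).trim(),
    // Normalize email casing for consistent matching downstream
    email: record.email.toLowerCase(),
    // Default a missing country code (assumed default "US")
    country: record.country || "US",
  };
}

const out = standardizeCustomer({
  firstName: " Jane",
  lastName: "Doe ",
  email: "Jane.Doe@Example.com",
});
console.log(out); // { fullName: 'Jane Doe', email: 'jane.doe@example.com', country: 'US' }
```

Being comfortable with this style of string and object manipulation is what makes the Mapper's "advanced expressions" approachable.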
What problems is the product solving and how is that benefiting you?
I use SnapLogic Intelligent Integration Platform (IIP) because it is beginner-friendly, allowing anyone to create pipelines without coding skills. Its advanced features and AI integration make it a powerful tool for my needs.