By Admin | November 2, 2023
Happy New Month!
The WebAutomation.io team is constantly updating our products and services, deploying new features and fixes while improving our users' experience.
We are all about making web scraping more accessible. Now, with the Usage Reporting feature, we're taking it a step further by providing you with real-time insights into your usage, whether that's the number of rows processed or the credits consumed. Here's how it works:
Our Usage Reporting is an innovative feature that lets you track and visualise your usage metrics, providing a comprehensive overview of your activity on our dashboard.
Data-Driven Decision Making: With Usage Reporting, you can make informed decisions based on how your web scrapers are being used.
Transparency: Usage Reporting enhances transparency and accountability within your organisation. It's an excellent tool for collaboration.
Resource Allocation: You can allocate additional resources to high-priority projects or make necessary adjustments to maximise your productivity.
Performance Optimisation: By keeping a watchful eye on your rows processed, you can gauge the efficiency of your automation workflows. Use this data to identify bottlenecks and streamline your processes for better performance.
Visual Clarity: The usage metrics are presented in an intuitive, visually appealing format, making it easy for users to understand their data at a glance. You won't need to sift through complex reports – it's all there in front of you.
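As a sketch of the kind of analysis Usage Reporting enables, the snippet below totals rows processed and credits consumed per scraper from a hypothetical usage export. The field names and scraper names here are illustrative assumptions, not WebAutomation.io's actual schema:

```python
from collections import defaultdict

# Hypothetical usage records, e.g. parsed from a dashboard export.
# The field names ("scraper", "rows", "credits") are illustrative only.
usage = [
    {"scraper": "linkedin-jobs", "rows": 1200, "credits": 12},
    {"scraper": "seek-jobs", "rows": 300, "credits": 3},
    {"scraper": "linkedin-jobs", "rows": 800, "credits": 8},
]

def summarise(records):
    """Total rows and credits per scraper, heaviest credit consumers first."""
    totals = defaultdict(lambda: {"rows": 0, "credits": 0})
    for r in records:
        totals[r["scraper"]]["rows"] += r["rows"]
        totals[r["scraper"]]["credits"] += r["credits"]
    return sorted(totals.items(), key=lambda kv: kv[1]["credits"], reverse=True)

for name, t in summarise(usage):
    print(f"{name}: {t['rows']} rows, {t['credits']} credits")
```

A summary like this makes it easy to see which scrapers account for most of your credit spend and where a bottleneck might be worth investigating.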
For easy access, we have created a step-by-step guide that walks you through how to view and utilise the sessions and API reports in your dashboard, in three easy steps. Learn more here
New Feature - Dashboard (Session and API)
We have also added a new feature on our API and session dashboard, providing holistic solutions for your automation projects. You can now set up and configure sessions and APIs in a matter of minutes. Specify your session or API details, including user agents, locations, and more, with just a few clicks.
We have added a streamlined interface for managing sessions, making it easy to create, configure, and monitor your automation sessions.
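For illustration only, a session configuration like the one described above might be expressed as a JSON payload along these lines. The field names are hypothetical assumptions for the sketch, not WebAutomation.io's documented API schema:

```python
import json

# Hypothetical session configuration; every field name below is
# illustrative, not WebAutomation.io's actual API schema.
session_config = {
    "name": "linkedin-jobs-daily",
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "location": "GB",   # assumed proxy/geo location code
    "concurrency": 2,   # assumed number of parallel requests
}

# Serialise the payload as it might be sent to a session-creation endpoint.
payload = json.dumps(session_config, indent=2)
print(payload)
```

In practice you would set these values through the dashboard's interface rather than by hand; the point is that a session boils down to a small, declarative bundle of settings.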
New Blog Articles
Here is a list of new or updated PDEs added recently:
Our predefined web extractors (PDEs) are specially designed to help you gather all the data you need with just a click of a button, without having to write any code!
And the best part? Our web scrapers are easy to use and FREE to try!
LinkedIn Jobs Scraper | Pre-Built, No Code Required: You can now extract job listings from LinkedIn effortlessly, with no coding needed. Download data in XLS, CSV, or JSON format within minutes.
Extract job postings from SEEK Australia: As with LinkedIn, you can quickly and easily extract job listings data from numerous listings without having to write any code.
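Once exported, the JSON output from a scraper can be post-processed with a few lines of standard Python. The sketch below converts a mocked job-listings export to CSV; the field names are illustrative, not the scrapers' actual output schema:

```python
import csv
import io
import json

# Mocked JSON export; a real export from the scrapers has its own schema.
raw = json.dumps([
    {"title": "Data Engineer", "company": "Acme", "location": "Sydney"},
    {"title": "ML Engineer", "company": "Globex", "location": "London"},
])

jobs = json.loads(raw)

# Write the listings to CSV (an in-memory buffer here; use a file in practice).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["title", "company", "location"])
writer.writeheader()
writer.writerows(jobs)
print(buf.getvalue())
```

The same pattern works for feeding the data into a spreadsheet, a database loader, or a pandas pipeline.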
Got any suggestions? We'd love to hear from you!
We firmly believe that the customer is king, and we believe in growing along with our customers and their needs. As you know, we have already built features based on customers' requests. Email us your thoughts anytime at firstname.lastname@example.org