Receipt Data Extraction and Analysis
Hourly rate: 25 - 50 USD/hr | Posted 16 hours ago
Client Rank: Medium | $200 total spent | 2 hires | 5 jobs posted | 40% hire rate | 1 open job | 5.00 of 1 review | Industry: Fashion & Beauty | Company size: 2 | Registered: Apr 22, 2024 | Toronto (local time 8:01 AM)
I'm looking for a freelancer to help me extract data from receipts. I have hundreds of these as pictures (JPGs), PDFs, and email attachments.
Typing all the information into a spreadsheet by hand is taking far too much time, so I need someone to build an automated system that can "read" these receipts and organize the information.

What I need the system to do:
- Look at a receipt file (image or PDF) and automatically pull out the key information.
- Put all of this information into a simple spreadsheet (such as Google Sheets or Excel).
- Be reasonably accurate; if the system can't read a receipt properly, it should just flag it so I can look at it myself.

What you should be good at:
- Experience setting up systems that can automatically read text from images and PDFs.
- Knowing the best tools for this job and being able to recommend the right approach.
- Explaining what you're doing in simple terms.

If done well, there is an opportunity for a future project: building an AI interface on top so I can simply ask questions about the receipts and have the AI analyze all this spending data.

How to apply: In your proposal, please briefly explain in simple terms how you would approach this, and share any examples of similar automation or data-reading projects you've done before.
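A pipeline like this usually splits into an OCR pass (a tool such as Tesseract) and a parse-and-flag pass. A minimal sketch of the second pass, with illustrative field patterns (vendor on the first line, a "Total" amount, a dash- or slash-delimited date) that real receipts would force you to extend:

```python
import re

# Illustrative patterns; real receipts need many more variants.
TOTAL_RE = re.compile(r"total[:\s]*\$?(\d+[.,]\d{2})", re.IGNORECASE)
DATE_RE = re.compile(r"(\d{2}/\d{2}/\d{4}|\d{4}-\d{2}-\d{2})")

def parse_receipt(ocr_text: str) -> dict:
    """Pull vendor, date, and total from OCR text; flag the receipt for
    manual review if any field could not be found, instead of guessing."""
    lines = [l.strip() for l in ocr_text.splitlines() if l.strip()]
    vendor = lines[0] if lines else None
    total_m = TOTAL_RE.search(ocr_text)
    date_m = DATE_RE.search(ocr_text)
    record = {
        "vendor": vendor,
        "date": date_m.group(1) if date_m else None,
        "total": float(total_m.group(1).replace(",", ".")) if total_m else None,
    }
    record["needs_review"] = any(v is None for v in record.values())
    return record
```

Flagged rows would go to a "review" tab of the spreadsheet rather than the main sheet, matching the posting's requirement.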
---
Data Scraping Specialist for Scientific Website
Budget: not specified | Posted 16 hours ago
Client Rank: Risky | 1 open job | Registered: Jul 15, 2025 (local time 2:01 PM)
We are looking for a skilled data scraper to gather information from a scientific website. The ideal candidate will have experience in web scraping and data extraction techniques. Your task will involve collecting specific datasets and ensuring the information is organized and accurate. Attention to detail is crucial for this project, as well as familiarity with scientific data formats. If you have a passion for data and a strong background in web scraping, we would love to hear from you. Details will be discussed in person.
**Relevant Skills:**
- Web Scraping
- Data Extraction
- Python / R / JavaScript
- Data Cleaning and Formatting
- Familiarity with APIs (if applicable)
- Attention to Detail
---
App for Document Data Extraction
Fixed budget: 30 - 250 USD | Posted 15 hours ago
Client Rank: Risky | 1 open job | Registered: Mar 3, 2025
I'm looking for a skilled developer to create an app that can extract specific text data from PDF documents and organize it into an Excel sheet. The app should be efficient and accurate in processing various document types to ensure reliable data extraction.
Key Requirements:
- Develop an app to process PDF documents
- Extract text data accurately from the documents
- Organize extracted data into an Excel sheet
- Ensure the app is user-friendly and efficient

Ideal Skills and Experience:
- Strong experience in app development, particularly with data extraction
- Proficiency in handling PDF files and text recognition technologies
- Familiarity with Excel and data organization
- Attention to detail to ensure data accuracy and reliability

I'm eager to collaborate with a developer who can deliver a robust solution for my data extraction needs.

Skills: Data Processing, Data Entry, Mobile App Development, Excel, Android, PDF, Data Extraction, App Development
---
Expert Data Scraper Required
Hourly rate: 2 - 8 USD/hr | Posted 15 hours ago
Client Rank: Excellent | $30,909 total spent | 26 hires, 2 active | 1 open job | 5.00 of 19 reviews | Registered: Feb 8, 2022
I am in need of an expert data scraper to extract text data from various sources. The data will be gathered from a range of random websites, so flexibility and adaptability are key.
Key Requirements:
- Ability to scrape text data from multiple and diverse websites
- Experience with different data extraction techniques
- Proficiency in handling data from varied platforms
- Ensuring data accuracy and integrity during the scraping process

Ideal Skills and Experience:
- Strong background in web scraping and data extraction
- Familiarity with web scraping tools and technologies
- Problem-solving skills to handle different website structures
- Experience with data cleaning and organization

If you have the expertise to efficiently gather and process data from a variety of sources, I look forward to your bid.

Skills: PHP, Excel, Web Scraping, Data Mining, Data Scraping, Data Extraction, Data Analysis, Data Management
---
Financial Model
Budget: not specified | Posted 15 hours ago
Client Rank: Risky | 1 open job | Registered: Apr 7, 2024 | Accra (local time 12:01 PM)
Financial Modeler Needed for Early-Stage Startup & Advisory on Parallel Project
1. Startup Financial Model
- Build a 3-5 year financial forecast model for investor conversations
- Structure assumptions for customer acquisition, pricing, costs, burn rate, and revenue
- Support cap table structuring, founder equity modeling, and investor dilution planning
- Deliver an editable Excel model with summary dashboards and scenarios

2. Parallel Project Support
- Provide light financial modeling or analysis for a related climate-tech project
- Possibly assist with basic pitch deck financials or a grant/business plan section

Ideal Candidate:
- Strong background in early-stage startup modeling
- Fluent with Excel or Google Sheets, and comfortable making assumptions transparent
- Prior experience with cap tables, SAFE notes, and equity planning is a big plus
- Bonus: background in engineering/finance, or has worked with technical founders
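The forecast mechanics the post describes (customer growth drives revenue, burn eats cash, runway follows) can be sketched in a few lines before committing them to Excel. Every number below is an illustrative assumption, not the client's figure:

```python
def forecast(years=3, customers0=100, growth=2.0, annual_price=50 * 12,
             fixed_costs=300_000, cac=200, cash=500_000):
    """Toy annual forecast: returns one row per year with revenue, burn,
    and ending cash. All parameter defaults are made-up assumptions."""
    rows, customers = [], customers0
    for year in range(1, years + 1):
        new_customers = customers * (growth - 1)   # customers added this year
        revenue = customers * annual_price
        costs = fixed_costs + new_customers * cac  # fixed opex + acquisition
        burn = costs - revenue
        cash -= burn
        rows.append({"year": year, "customers": int(customers),
                     "revenue": revenue, "burn": burn, "cash_end": cash})
        customers *= growth
    return rows
```

In the deliverable these same relations would live as transparent formulas with the assumptions in a dedicated inputs tab, so investors can flex them.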
---
Fix Amazon Scraper Script -- 3
Fixed budget: 10 - 30 USD | Posted 14 hours ago
Client Rank: Excellent | $66,199 total spent | 62 hires | 3 open jobs | 5.00 of 7 reviews | Registered: Apr 4, 2018
I'm looking for a developer to fix an existing scraping script. The script currently returns "N/A" for 90% of the values, and I need this issue resolved: each value should read "yes" or "no" depending on whether the listing offers free returns. I believe the issue is that the proxy server isn't sending US IP addresses when it should be (though I could be wrong about that). I need you to log into my machine and fix the problem.

Key Requirements:
- PHP, Java, C languages
- Experience with web scraping
- Familiarity with proxy services

Ideal Skills:
- Strong debugging skills
- Knowledge of Amazon's HTML structure
- Previous experience with scraping tools and libraries

Skills: PHP, JavaScript, Web Scraping, MySQL, HTML, Debugging, Data Extraction, API Development
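If the proxy theory is right, the fix has two halves: verify each session's exit IP geolocates to the US before scraping (e.g., by requesting an IP-echo endpoint through the proxy), and only then derive the yes/no flag. A hedged sketch of the second half; the "free returns" marker phrase is an assumption, since Amazon's actual markup varies and would need checking against real pages:

```python
import re

def free_returns_flag(listing_html: str) -> str:
    """Return "yes" if the listing HTML contains a free-returns marker,
    else "no". The marker phrase is an assumed placeholder."""
    return "yes" if re.search(r"free\s+returns", listing_html, re.IGNORECASE) else "no"
```

With a geo-verified session, widespread "N/A" would then point at selectors rather than the proxy.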
---
Python Web Scraper Development
Hourly rate: 10 - 20 USD/hr | Posted 15 hours ago
Client Rank: Medium | 6 jobs posted | 1 open job | Registered: Jan 30, 2021 | Okara (local time 5:01 PM)
We're looking for a skilled Python developer to create a web scraper tailored to our specific needs. The ideal candidate will have experience in data extraction, web automation, and handling various web technologies. You will be responsible for developing a robust scraping solution that can handle dynamic content and ensure data accuracy. If you have a passion for data manipulation and can work efficiently, we would love to hear from you!
---
Geospatial Weather Project
Hourly rate: 100 - 160 USD/hr | Posted 11 hours ago
Client Rank: Medium | 1 open job | Registered: Jul 18, 2025 (local time 7:01 AM)
We are seeking a skilled freelancer to assist in mapping spaced-out addresses by their zip codes, utilizing Verisk's weather API to retrieve relevant weather data. The ideal candidate will compile and present the results in a clear and organized manner. This project requires attention to detail and proficiency in data mapping and API integration. Experience with geolocation services and data analysis is a plus. If you have a knack for managing and interpreting location-based data, apply now!
---
Web Scraper Developer for Tax Auction Property Data (Real Estate Investing)
Hourly rate: 25 - 35 USD/hr | Posted 11 hours ago
Client Rank: Good | $1,878 total spent | 6 hires, 4 active | 7 jobs posted | 86% hire rate | 2 open jobs | $5.87/hr avg hourly rate paid | 300 hours paid | 5.00 of 6 reviews | Industry: Real Estate | Individual client | Registered: May 2, 2024 (local time 5:01 AM)
I’m looking for an experienced web scraping developer to build an automated script that collects and enriches property data from Florida tax deed auctions, specifically from RealTaxDeed.com.
The script will:
- Scrape auction property data (parcel number, address, opening bid, auction date, etc.) from RealTaxDeed.com
- Enrich each record with additional property information by pulling:
  - Photos and lot maps from county GIS websites
  - Street view and satellite images from Google Maps
  - FEMA flood map results
  - Basic property details from ID.land (if accessible programmatically)
- Automatically update a Google Sheet or Airtable where my existing formulas will score investment potential

The process should be fully automated with minimal manual input; ideally, the script runs on a set schedule (daily or weekly) and continuously updates the database.

Requirements:
- Expertise in web scraping (Python, Selenium, Puppeteer, BeautifulSoup, or similar)
- Experience with Google Sheets or Airtable API integration
- Ability to handle CAPTCHAs, login sessions, and rotating proxies if needed
- STRONG ENGLISH COMMUNICATION SKILLS (clear updates and documentation are critical)
- Experience with real estate or auction data scraping is a plus

Deliverables:
- A fully functional script that runs automatically and updates the Google Sheet
- Clean, well-documented code so future developers can maintain or expand it
- Instructions for running and scheduling the script

Bonus (not required, but nice to have):
- Ability to add basic deal-scoring logic later based on criteria (e.g., utility availability, assessed value %)

To Apply, Please Answer:
- Briefly describe your experience scraping real estate or auction sites.
- Which technology stack would you recommend for this?
- What's your estimated turnaround time for a first working version?

Budget: Open to proposals, but seeking an experienced developer who can deliver a reliable, production-ready solution.

Ongoing Work: If this goes well, I'll hire long-term for additional tax auction and land bank sites nationwide.
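A scheduled scrape-then-enrich job like this usually hinges on an upsert step: refresh existing rows by a stable key (the parcel number) instead of appending duplicates each run. A minimal sketch of that step, with the Sheets/Airtable push left out; the field names are illustrative:

```python
def merge_records(store: dict, scraped: list[dict]) -> dict:
    """Upsert scraped auction rows into a store keyed by parcel number.
    New keys are added; existing rows have their fields refreshed, so
    enrichment data (flood zone, GIS links, etc.) survives re-scrapes."""
    for row in scraped:
        key = row["parcel_number"]
        store.setdefault(key, {}).update(row)
    return store
```

The scheduler (cron, or a cloud function) would run scrape -> enrich -> merge -> push, and the existing Sheet formulas then score whatever lands in the rows.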
---
Airtable CSV Upload + Make.com Automation
Fixed budget: 30 USD | Posted 11 hours ago
Client Rank: Excellent | $24,061 total spent | 33 hires, 10 active | 30 jobs posted | 100% hire rate | 1 open job | $5.64/hr avg hourly rate paid | 2,588 hours paid | 4.99 of 20 reviews | Industry: Real Estate | Company size: 2 | Registered: Feb 23, 2017 | Richmond (local time 9:01 AM)
Need help modifying Airtable interface for uploading CSV files and a Make.com automation to process them. Need ASAP - 24 to 48 hours maximum delivery.
What I need:
- Form interface with dropdown selection and file upload fields
- Conditional field visibility (show different upload fields based on dropdown selection)
- Make.com automation to extract data from uploaded CSV files and create records
- Simple table view to edit the extracted data

Requirements:
- Experience with Airtable interfaces
- Basic Make.com automation skills
- CSV file processing knowledge

The project involves 3 different file providers with 5 total file formats that need specific data extraction. I'll provide detailed specs and sample files to the selected freelancer. The file format is very simple: I am just looking to pull data from one column from each upload, that's it.

Budget: $30-40 USD MAX
Timeline: 24-48 hour delivery maximum
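Since each upload only contributes one column, the core of the Make.com scenario reduces to "pick provider, read that provider's column, drop blanks." A sketch of that logic in plain Python for clarity; the provider-to-column mapping is hypothetical, standing in for the client's real specs:

```python
import csv
import io

# Hypothetical mapping; the real one comes from the client's specs.
PROVIDER_COLUMN = {"provider_a": "Email", "provider_b": "Phone"}

def extract_column(csv_text: str, provider: str) -> list[str]:
    """Return the non-empty values of the one column this provider's
    uploads contribute, ready to be written as Airtable records."""
    col = PROVIDER_COLUMN[provider]
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row[col].strip() for row in reader if row.get(col, "").strip()]
```

In Make.com the same shape is a Router keyed on the dropdown value feeding a Parse CSV module, with one mapped field per branch.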
---
Data Scraping Expert Needed for Large Data Sets
Hourly rate: 4 - 6 USD/hr | Posted 10 hours ago
Client Rank: Medium | $460 total spent | 2 hires, 3 active | 5 jobs posted | 40% hire rate | 3 open jobs | $6.17/hr avg hourly rate paid | 55 hours paid | Registered: Mar 22, 2025 | Schofields (local time 10:01 PM)
We are seeking a skilled data scraping expert to help us extract large volumes of data using Python or other suitable tools. The ideal candidate will have experience in web scraping and familiarity with data processing techniques. You should be able to work efficiently and ensure data accuracy. If you are detail-oriented and have a passion for data extraction, we want to hear from you!
---
Build Scraper tool for Subscription Offers on Best Buy, Walmart, Target (potential ongoing)
Fixed budget: 500 USD | Posted 10 hours ago
Client Rank: Medium | 2 jobs posted | 1 open job | Registered: May 28, 2025 (local time 5:01 AM)
Build Scalable Web Scraper for Subscription Offers on Best Buy, Walmart, and Target (Fixed Price)

Project Type: One-time project, fixed price (with potential ongoing work)

Job Description: We are seeking an experienced web scraping specialist to build a scalable, self-serve data extraction tool for tracking subscription service offers (e.g., YouTube Premium, Apple TV+, SiriusXM, Xbox Game Pass, Fubo TV) and product reviews from the following U.S. retail websites: BestBuy.com, Walmart.com, Target.com.

Deliverables:
1) A working scraping tool that:
- Crawls thousands of product detail pages (PDPs)
- Extracts the subscription services offered with eligible products
- Captures the number of customer reviews and the average score / star rating
- Adapts to each retailer's specific PDP layout
2) Output format:
- Structured Excel or CSV file
- Must match (or be adapted from) a sample Excel template (to be reviewed together on a short video call)
3) Anti-blocking measures:
- Proxy management, IP rotation, CAPTCHA bypass (if needed), and the ability to render dynamic content
4) Self-serve setup:
- Tool must run locally on my machine (MacBook with M3 processor)
- Should not require deep technical knowledge to operate monthly
5) Short documentation or video walkthrough covering:
- How to install and set up the tool
- How to use it to generate the monthly report
---
Data Scrape for email addresses and lead generation for realtors
Fixed budget: 50 USD | Posted 11 hours ago
Client Rank: Risky | 1 open job | Industry: Real Estate | Individual client (local time 7:01 AM)
Need to scrape the email addresses of realtors from an agency website (~1,100), and if you can also help with lead generation for realtors operating in the Houston area, that would be helpful.
---
Denmark Restaurant Lead Research + Email & Phone Extraction
Hourly rate: 4 - 7 USD/hr | Posted 11 hours ago
Client Rank: Medium | 1 job posted | 1 open job | Registered: Jul 31, 2022 (local time 9:01 AM)
I'm looking for a virtual assistant to help identify and collect restaurant leads in Denmark. You'll be working with public Danish business databases and using email scraping tools to find accurate contact info for outreach.
This is not copy-paste work; I need someone thoughtful and detail-oriented who can follow a lead generation process.

Your Tasks Will Include:
- Search for restaurants using Google Maps
- Look up each business in CVR.dk / Virk.dk to:
  - Get the owner's full name
  - Confirm company registration
- Use tools to find verified work emails based on the owner name and company domain
- Cross-check LinkedIn when needed to ensure accuracy
- Enter verified data into a Google Sheet with: restaurant name, owner name, city, phone, domain, and notes (LinkedIn, email source, confidence)

Tools We Use: Google Maps, CVR.dk / Virk.dk, AnyMailFinder, Clay (etc.), Google Sheets

Requirements:
- Strong attention to detail
- Able to navigate CVR.dk / Virk.dk
- Comfortable using email lookup tools
- Fluent in English (basic Danish is a plus)
- Prior experience with lead generation or VA work

To apply, please:
- Share your experience with email finding or lead generation
- Confirm you're familiar with (or willing to learn) CVR.dk / Virk.dk
- List any tools you've used before (Hunter, Snov, LinkedIn, etc.)
- Include your hourly rate and availability
---
Make.com + Airtable Complex Automation
Fixed budget: 150 USD | Posted 8 hours ago
Client Rank: Excellent | $24,061 total spent | 34 hires, 10 active | 30 jobs posted | 100% hire rate | 2 open jobs | $5.64/hr avg hourly rate paid | 2,588 hours paid | 4.99 of 20 reviews | Industry: Real Estate | Company size: 2 | Registered: Feb 23, 2017 | Richmond (local time 9:01 AM)
URGENT: Airtable + Multi-Step Make.com Automation (Needed in 3-4 Days)

Budget: $150-250 USD MAX
Timeline: 3-4 days maximum

EXPERIENCED PEOPLE ONLY. Don't apply if you're still learning. I need someone who knows what they're doing and can get this done fast; no time for training or figuring things out.

Project Overview: This project converts a mostly manual report-upload process into roughly 90% automated report generation. Clients upload their sales data from different booking systems, and I create customized reports for them. I need automation that takes their raw CSV files, cleans the data, populates charts in Google Sheets, generates branded reports in Google Slides, and emails the final PDF reports. The Slides template is already created and linked. The automation will kick off from a submission button in Airtable.

What This Involves:
- One Make.com scenario that downloads CSV files and creates reports
- Handle 3 different file formats; files only have 5-7 columns each
- Get data from two linked Airtable tables (CSVs, client info, checkboxes)
- Use the existing mapping table to change service names (mapping table already exists)
- Upload data to Google Sheets using the Add Rows module
- Use the Create from Template module for Google Slides and delete unwanted slides
- Send final reports via email

Technical Work:
- 25-30 Make.com modules, including Iterator, HTTP Get File, CSV Parse, and Airtable Search Records
- Download and parse CSV files from Airtable attachments
- Use Search Records to find mappings and apply them to the data
- Use Google Sheets Add Rows to upload processed data
- Use Google Slides Create from Template and Delete Slides modules
- Router modules for different file types and report formats

I've already tested multiple functions in this flow (Create from Template, Delete Slides, CSV Parse, and Add Rows to Google Sheets), so I know it can be done.

You Need Experience With:
- Building Make.com scenarios with 20+ modules
- Airtable Search Records with filters and linked tables
- CSV processing with the Parse CSV module
- Google Sheets Add Rows and Update Values modules
- Google Slides Create from Template and Delete Slides modules
- Router modules with conditional logic and multiple paths

Don't Apply If:
- You've never used Iterator and Aggregator modules
- You're not familiar with Router modules and conditional logic
- You need to learn the Google modules during this project
- You need guidance on basic Make.com operations

Show me your biggest or most complex Make.com automation and confirm you're available to start immediately.
---
IRS 990 Data Extraction using Python Script
Fixed budget: 100 USD | Posted 7 hours ago
Client Rank: Good | $1,560 total spent | 9 hires, 3 active | 9 jobs posted | 100% hire rate | 2 open jobs | $9.79/hr avg hourly rate paid | 69 hours paid | 5.00 of 6 reviews | Industry: Real Estate | Individual client | Registered: May 9, 2022 | Seattle (local time 5:01 AM)
Description:
We are seeking a detail-oriented freelancer to run a ready-to-use Python script that extracts board member and tax preparer information from IRS 990 filings. The input is a CSV of ~1,772 nonprofit organizations tied to HUD multifamily properties. Your job is to run the script, troubleshoot any errors, and return a clean CSV with the requested fields.

What You'll Do:
- Use the provided Python script (990_officer_preparer_extractor_with_HUD_limited.py)
- Use the input CSV: Master_990_Research_1771_2025_07_18.csv
- Extract from the latest 990 XML filings:
  - Up to 5 officer/board member names per org
  - Titles (e.g., President, Chair, Treasurer, ED)
  - Tax preparer name and preparer firm
  - IRS filing URL
- Fill out the output CSV exactly as structured: HUDPropertyID, EIN, officer_name, officer_title, tax_preparer_name, tax_preparer_firm, filing_url
- Mark any EIN not found or 990 unavailable in an error column

What We Provide:
- Python script (already tested and ready to run)
- Sample .env for the ProPublica API key
- Full README and setup instructions
- Input file with 1,772 records

Requirements:
- Proficient with Python 3.9+, pandas, requests, lxml, tqdm
- Familiarity with XML parsing
- Able to register a free ProPublica API key (we'll guide you; it's a 1-minute process)
- Familiar with working through rate limits or temporary fetch issues
- Attention to detail; clean and structured data output is key (no dropped columns or encoding errors)

Scope Details:
- For each nonprofit, use the EIN if available; otherwise, use org name + city/state to look up the EIN (via the ProPublica API)
- Download the latest available IRS 990 XML filing (2011-2023)
- Append the extracted fields to the output CSV without deleting or reordering any of the original columns
- The final output must contain all original columns from the input file, plus: EIN, officer_name, officer_title, tax_preparer_name, tax_preparer_firm, filing_url, and error (if any)
- If multiple officers exist, output multiple rows per HUDPropertyID (one per officer), duplicating other values as needed

Budget & Timeline:
We are open to fixed-price bids. This task typically takes 3-4 hours for a skilled Python freelancer with the script already prepared. Recommended bid range: $75-$125 USD flat rate, with a target of $100 USD (negotiable based on speed and quality; based on similar jobs, we'll prioritize speed + accuracy and will tip for early, clean delivery). Please include a sentence confirming that you've reviewed the instructions and are ready to begin upon award.

Follow-Up Work:
This is one part of a larger nonprofit + HUD acquisition project. If the first batch goes well, we may hire you again for future rounds or additional extraction and enrichment work.
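The XML step at the heart of this task looks roughly like the sketch below (the provided script presumably does much more, including the ProPublica lookups). IRS 990 e-file XML uses the `http://www.irs.gov/efile` namespace, and recent form versions carry officers in `Form990PartVIISectionAGrp` groups with `PersonNm`/`TitleTxt` children; tag names vary across form years, so treat these as assumptions to verify against real filings:

```python
import xml.etree.ElementTree as ET

NS = {"irs": "http://www.irs.gov/efile"}

def extract_officers(xml_text: str, limit: int = 5) -> list[dict]:
    """Pull up to `limit` officer name/title pairs from a 990 XML filing."""
    root = ET.fromstring(xml_text)
    officers = []
    for grp in root.iterfind(".//irs:Form990PartVIISectionAGrp", NS):
        name = grp.findtext("irs:PersonNm", default="", namespaces=NS)
        title = grp.findtext("irs:TitleTxt", default="", namespaces=NS)
        if name:
            officers.append({"officer_name": name, "officer_title": title})
        if len(officers) == limit:
            break
    return officers
```

Each returned dict becomes one output row per HUDPropertyID, matching the one-row-per-officer requirement.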
---
Convert PDF Data (Page 307-426) into Interactive Excel Tool
Fixed budget: 50 USD | Posted 7 hours ago
Client Rank: Excellent | $22,245 total spent | 66 hires, 15 active | 101 jobs posted | 65% hire rate | 5 open jobs | 4.96 of 47 reviews | Registered: Feb 18, 2020 | Kelowna (local time 1:01 PM)
Job Description:
We are looking for a detail-oriented Excel expert to extract, structure, and convert data from a PDF (pages 307 to 426) into a well-organized, user-friendly Excel sheet. The goal is an Excel file that not only contains accurate table and figure data from the specified pages but also lets us input values and get dynamic results or references based on those inputs.

Scope of Work:
- Extract tables and figures only from pages 307 to 426 of the provided PDF
- Ensure all data is transcribed accurately
- Design the Excel file to allow input-based results (e.g., using formulas, lookup functions, dropdowns)
- Maintain a clean and logical layout, organized by table/figure or section as needed
- The final output must be easy to navigate and ready for decision-making or analysis

Deliverables: One Excel file with:
- All extracted data from the specified pages
- Interactive inputs (e.g., dropdowns or value cells)
- Formula-driven outputs based on inputs
- Well-labeled sheets/tabs for each table/section (if applicable)
- Error-free formatting and consistent structure

Ideal Freelancer:
- Proven experience with PDF to Excel conversion
- Strong skills in Excel formulas and data modeling (e.g., VLOOKUP, INDEX/MATCH, conditional logic)
- Ability to design clean and intuitive spreadsheets
- Meticulous with data accuracy and formatting
- Responsive and able to meet deadlines

Additional Notes: The PDF will be attached once the contract begins. This is a one-time task, but excellent work may lead to future spreadsheet/data projects.

To Apply, please include:
- A short description of similar projects you've worked on
- An example of an interactive Excel sheet you've created (if available)
- Your estimated turnaround time for this task

We need this task completed in the next 12 hours.
---
Arabic-Speaking Automation Expert Needed for WhatsApp Job Alert System (n8n + Apify + GPT)
Budget: not specified | Posted 7 hours ago
Client Rank: Medium | $588 total spent | 2 hires | 6 jobs posted | 33% hire rate | 1 open job | $24.99/hr avg hourly rate paid | 10 hours paid | 5.00 of 2 reviews | Registered: Jun 9, 2025 | Riyadh (local time 3:01 PM)
Project Overview
We're building an automated system that scrapes new job postings from specific websites, processes the data, classifies it by job field (e.g., Engineering, Marketing, IT), and notifies the right user through a connected messaging platform.

Scope of Work:
- Scraping job posts from selected websites (structured and unstructured)
- Parsing job content and extracting key data (title, location, specialization, etc.)
- Using automation tools (e.g., n8n or similar) to run scheduled workflows
- Connecting the automation to a spreadsheet (Google Sheets or Airtable)
- Filtering new jobs based on pre-defined criteria (for each user)
- Sending the filtered jobs via a messaging integration (optional phase)

Bonus Skills (Optional):
- Familiarity with setting up messaging APIs and automation platforms
- Experience with job scraping or classified websites is a plus

In Your Proposal, Please Include:
- Examples of similar automation or scraping projects you've worked on
- What tools or stack you'd recommend and why
- How you'd approach this project step by step
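The classify-and-route step can start as a plain keyword pass inside the n8n workflow (a Code or Function node) before, or as a fallback to, a GPT call. A sketch with illustrative field keywords; the real lists would come from the client's categories:

```python
# Illustrative keyword lists; dict order sets match priority.
FIELD_KEYWORDS = {
    "Engineering": ["engineer", "mechanical", "civil"],
    "Marketing": ["marketing", "seo", "brand"],
    "IT": ["developer", "software", "network"],
}

def classify_job(title: str) -> str:
    """Map a job title to a field by keyword match; unmatched titles
    fall through to "Other" (or to an LLM pass in the full system)."""
    t = title.lower()
    for field, words in FIELD_KEYWORDS.items():
        if any(w in t for w in words):
            return field
    return "Other"
```

Each user's pre-defined criteria then filter on the resulting field before the messaging step fires.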
---
Data Extraction from Image-Based PDF
Hourly rate: ~15 - 29 USD/hr | Posted 7 hours ago
Client Rank: Risky | 1 open job | Registered: May 27, 2025
I need assistance extracting specific data from high-quality image-based PDFs (as well as Word files and apps) and organizing it into an Excel spreadsheet. The PDF contains multiple entries, and I require the following details to be accurately extracted and placed in separate columns, as per the given formats.

Key Requirements:
- Extract data accurately from a high-quality image-based PDF
- Organize the data into an Excel file with each field in its respective column
- Ensure that each entry is placed in the correct row without any extra or irrelevant data
- Handle variations in entry formatting with precision
- Retain only standard text, excluding any special characters or symbols

Ideal Skills and Experience:
- Proficiency in data extraction from image-based PDFs
- Strong attention to detail to ensure data accuracy and consistency
- Experience with Excel for data organization and formatting
- Ability to handle inconsistencies in data formatting effectively

Please ensure that the final output is clean, organized, and meets all specified requirements.

Skills: Data Processing, Data Entry, Excel, Web Scraping, Image Processing, Data Cleansing, Data Extraction, Data Analysis, Data Management
Hourly rate (original currency): 1250 - 2500 INR

---
Save Website HTML Pages with Ctrl-S
Fixed budget: 250 - 750 USD | Posted 6 hours ago
Client Rank: Excellent | $89,447 total spent | 65 hires, 2 active | 3 open jobs | 4.98 of 44 reviews | Registered: Apr 18, 2006
I need assistance saving HTML pages from a website. The project involves saving thousands of pages along with all associated assets, including images, CSS, and JavaScript. The saved pages must be organized into specific folders per my requirements. A simple Ctrl-S, with a filepath prepended in the save-filename dialog, would work. I need to be able to either provide a spreadsheet with a list of links or use an API interface that specifies a given link and save location.

Key Requirements:
- Save thousands of HTML pages
- Include all assets: images, CSS, and JavaScript
- Organize saved pages into specific folders, as described above

Ideal Skills and Experience:
- Proficiency in web scraping and data extraction
- Experience handling and organizing large volumes of web data
- Knowledge of HTML, CSS, and JavaScript
- Ability to structure data in specified formats

Skills: Web Scraping, HTML, API Development, Data Management
---
Podcast Data Extraction & Problem Categorization
Fixed budget: 75 USD | Posted 6 hours ago
Client Rank: Excellent | $129,239 total spent | 75 hires, 16 active | 74 jobs posted | 100% hire rate | 1 open job | $26.21/hr avg hourly rate paid | 4,396 hours paid | 4.98 of 69 reviews | Industry: Sales & Marketing | Company size: 10 | Registered: Dec 17, 2016 | Bellevue (local time 9:01 AM)
We have a spreadsheet of podcast episodes. For each episode, we need you to:
- Extract the guest name and company/organization from the episode description and add them to the file.
- Analyze the problem or issue the episode addresses.
- Add one column with a short description of the problem discussed.
- Add another column with a high-level problem category (e.g., labor inequality, product waste, urban finance, health misinformation).

You may use an LLM to help with the task, but we expect the final output to be accurate and human-reviewed. Please include a short note on your experience and complete 5 sample rows (from 5 different podcast episodes) from the attached spreadsheet. You can define your own categories, but the goal is to map each episode to a real-world societal problem.
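A first pass at the guest/company pull can be a pattern match over the description, with anything unmatched routed to manual (or LLM) review. The phrasings assumed here ("with <Name> of <Company>", "guest <Name> from <Company>") are guesses; real descriptions vary widely:

```python
import re

# Assumed description patterns; unmatched rows fall back to review.
GUEST_RE = re.compile(
    r"(?:with|guest)\s+([A-Z][a-z]+ [A-Z][a-z]+)"      # two-word guest name
    r"(?:,?\s+(?:of|from)\s+([A-Z][\w&. ]+?))?"        # optional company
    r"(?:[.,]|$)"
)

def extract_guest(description: str):
    """Return (guest, company); either may be None when no pattern fits."""
    m = GUEST_RE.search(description)
    if not m:
        return None, None
    return m.group(1), (m.group(2) or "").strip() or None
```

Given the human-reviewed accuracy bar in the post, this would only pre-fill rows, never finalize them.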
---
Data Extraction from Image-Based PDF
Fixed budget: ~145 - 435 USD | Posted 5 hours ago
Client Rank: Risky | 1 open job | Registered: Jun 27, 2022
I need assistance in extracting specific data from a high-quality image-based PDF, word, apps and organizing it into an Excel spreadsheet. The PDF contains multiple entries, and I require the following details to be accurately extracted and placed in separate columns, as per the given formats.
Key Requirements:
- Extract data accurately from a high-quality image-based PDF.
- Organize the data into an Excel file with each field in its respective column.
- Ensure that each entry is placed in the correct row without any extra or irrelevant data.
- Handle variations in entry formatting with precision.
- Retain only standard text, excluding any special characters or symbols.
Skills Required: Data Processing, Data Entry, Excel, Web Scraping, Image Processing, Data Cleansing, Data Extraction, Data Analysis, Data Management
Ideal Skills and Experience:
- Proficiency in data extraction from image-based PDFs.
- Strong attention to detail to ensure data accuracy and consistency.
- Experience with Excel for data organization and formatting.
- Ability to handle inconsistencies in data formatting effectively.
Please ensure that the final output is clean, organized, and meets all specified requirements.
Skills: Data Entry, Excel, Data Mining, Image Processing, Data Cleansing, Data Extraction, Data Analysis
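The OCR step itself would typically go through a tool such as Tesseract (assumed, not shown here), but the "retain only standard text" requirement can be sketched as a small post-processing rule. The allowed character set below is an assumption and would be agreed with the client:

```python
import re

def clean_field(raw: str) -> str:
    """Keep only standard text: letters, digits, spaces, and basic
    punctuation, per the 'no special characters or symbols' rule."""
    # Drop everything outside the assumed allowed set.
    cleaned = re.sub(r"[^A-Za-z0-9 .,/-]", "", raw)
    # Collapse the whitespace gaps the removals leave behind.
    return re.sub(r"\s+", " ", cleaned).strip()
```

Applied to each OCR'd cell before it is written into its Excel column, this keeps symbol noise out of the final spreadsheet.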
Fixed budget:
12,500 - 37,500 INR
5 hours ago
|
|||||
Developing a Chatbot Based on Google Gemini API
|
250 - 750 USD | 5 hours ago |
Client Rank
- Excellent
$32 149 total spent
20 hires, 2 active
1 open job
4.63
of 2 reviews
Registered: Mar 10, 2014
5
|
||
I want to create an educational chatbot demo using Google Gemini API. We provide online video education using Python, and we want to allow students to ask questions to the chatbot and receive answers while watching the video education. If possible, we want to provide answers from our educational materials (PDF) and Q&A.
Main requirements:
- Develop a chatbot based on the Google Gemini API
- Implement an answering function for students' questions
- Provide answers using the PDF textbook and Q&A data
- Provide source code as the final output
Required skills and experience:
- Understanding and utilizing the Google Gemini API
- Experience developing chatbots
- Ability to process PDF and Q&A data
- Flexibility across various programming languages
The PDF textbook and Q&A data will be provided through file upload. We hope to receive a lot of support from those who are interested in this project.
Skills: Python, Data Processing, Software Architecture, Data Extraction, Data Analysis, Google APIs, Chatbot, API Development, Natural Language Processing, AI Chatbot Development
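Answering from the client's own PDF materials usually means retrieving the relevant passage first and passing it to the model as context. The Gemini call itself is omitted here; the sketch shows only a dependency-free retrieval step (a production build would likely use embeddings instead of word overlap):

```python
def best_chunk(question: str, chunks: list[str]) -> str:
    """Pick the textbook chunk sharing the most words with the question.
    Plain word overlap keeps the sketch dependency-free; a real build
    would rank chunks by embedding similarity."""
    q_words = set(question.lower().split())
    return max(chunks, key=lambda c: len(q_words & set(c.lower().split())))
```

The selected chunk would then be prepended to the student's question in the prompt sent to Gemini, so answers stay grounded in the course PDF.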
Fixed budget:
250 - 750 USD
5 hours ago
|
|||||
Data Extraction from LinkOne Offline Catalog
|
15 - 40 USD
/ hr
|
1 hour ago |
Client Rank
- Medium
$400 total spent
1 hires, 1 active
2 jobs posted
50% hire rate,
1 open job
Industry: Manufacturing & Construction
Company size: 2
Registered: Mar 6, 2025
Tallinn
3:01 PM
3
|
||
We are looking for a technically capable freelancer to extract structured parts data from a LinkOne-based offline parts catalog. The catalog is provided as a folder containing its original internal data files (not PDFs).
Objective:
• Analyze the catalog folder and determine how part data is stored
• Extract structured part records including: Part Number, Description, Model and Group reference
• Upload the extracted data to our Supabase PostgreSQL database
What You’ll Receive:
• A complete catalog folder (original file structure)
• Screenshots of file layout (attached to this post)
• Supabase credentials: Project URL, Public API key, Target table schema
Deliverables:
• A working tool or script (Python preferred) that parses the catalog folder, extracts usable records, and uploads data to Supabase
• One-time output (CSV, SQL, or Supabase insert)
• Clear instructions for applying the same process to future catalogs
Important Notes:
• This is not a PDF or XML job - the catalog is based on a legacy LinkOne format
• The file structure may vary - we are open to partial results if full extraction is not feasible
• You must explain limitations or challenges clearly if full data cannot be retrieved
Ideal Experience:
• Experience with legacy or proprietary data structures
• Reverse engineering / data mining
• Python scripting (text/binary parsing, regex, etc.)
• Familiarity with Supabase or Postgres APIs
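Since the LinkOne format is proprietary, a plausible first reconnaissance pass is to pull printable ASCII runs out of the binary files, much like the Unix `strings` tool, to see where part numbers and descriptions live. This is only the exploration step — the upload to Supabase (via its REST API or a Postgres driver) would follow once the record layout is understood:

```python
import re

def printable_runs(blob: bytes, min_len: int = 6) -> list[str]:
    """Extract runs of printable ASCII of at least min_len bytes from
    an unknown binary blob, to locate embedded part data."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.decode("ascii") for m in re.findall(pattern, blob)]
```

Running this over each file in the catalog folder typically reveals whether part numbers sit in fixed-width records, delimited tables, or an index file, which then guides the real parser.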
Hourly rate:
15 - 40 USD
1 hour ago
|
|||||
Pull a database from a website
|
50 USD | 1 hour ago |
Client Rank
- Good
$1 667 total spent
14 hires, 5 active
16 jobs posted
88% hire rate,
2 open job
4.57
of 4 reviews
Industry: Health & Fitness
Individual client
Registered: Oct 12, 2023
ORANGE
10:01 PM
4
|
||
Pull a database from a website - member details. I can see them all individually, but I need them all in a spreadsheet.
A few people have attempted this - and they can only see 60 members when there are over 350. So you will need to be very experienced with databases to complete this.
Fixed budget:
50 USD
1 hour ago
|
|||||
Data Entry and Data Mining
|
~1 - 5 USD
/ hr
|
1 hour ago |
Client Rank
- Risky
1 open job
Registered: Jul 19, 2025
1
|
||
Experienced Data Entry & Data Mining Specialist
We are looking for a highly experienced and detail-oriented individual with a strong background in data entry and data mining. The task involves accurately transferring data from our existing software platform into an Excel spreadsheet and then importing it into another software system.
Key Responsibilities:
- Extract and enter data from the existing software system.
- Organize the data in a structured Excel sheet format.
- Ensure that all data fields are accurately filled, without errors or omissions.
- Import or input the cleaned data into the destination software.
- Maintain strict accuracy and consistency across all entries.
Required Skills:
- Proven experience in data entry and data mining.
- Strong knowledge of Microsoft Excel (formulas, formatting, data validation).
- Familiarity with different types of software platforms and data transfer processes.
- Ability to work independently and meet deadlines.
- Excellent attention to detail and data accuracy.
Preferred Experience:
- Handling software-to-software data migration.
- Knowledge of OCR tools or automation scripts is a plus.
- Prior experience working with database entries and spreadsheet organization.
If you are reliable, skilled, and ready to start immediately, please get in touch with your portfolio or previous experience in similar work.
Skills: Data Entry, Excel, Web Scraping, Data Mining, Data Extraction, Data Analysis, Database Management, Data Management
Hourly rate:
100 - 400 INR
1 hour ago
|
|||||
Expert Data Entry & Migration Specialist
|
~1 - 5 USD
/ hr
|
35 minutes ago |
Client Rank
- Risky
1 open job
Registered: Jul 19, 2025
1
|
||
Project Details
₹100.00 – 400.00 INR per hour. Bidding ends in 6 days, 23 hours.
Experienced Data Entry & Data Mining Specialist
We are looking for a highly experienced and detail-oriented individual with a strong background in data entry and data mining. The task involves accurately transferring data from our existing software platform into an Excel spreadsheet and then importing it into another software system.
Key Responsibilities:
- Extract and enter data from the existing software system.
- Organize the data in a structured Excel sheet format.
- Ensure that all data fields are accurately filled, without errors or omissions.
- Import or input the cleaned data into the destination software.
- Maintain strict accuracy and consistency across all entries.
Required Skills:
- Proven experience in data entry and data mining.
- Strong knowledge of Microsoft Excel (formulas, formatting, data validation).
- Familiarity with different types of software platforms and data transfer processes.
- Ability to work independently and meet deadlines.
- Excellent attention to detail and data accuracy.
Preferred Experience:
- Handling software-to-software data migration.
- Knowledge of OCR tools or automation scripts is a plus.
- Prior experience working with database entries and spreadsheet organization.
If you are reliable, skilled, and ready to start immediately, please get in touch with your portfolio or previous experience in similar work.
Skills Required: Data Entry, Excel, Web Scraping, Data Mining, Data Extraction, Data Analysis, Database Management, Data Management
Skills: Data Processing, Data Entry, Excel, Web Scraping, Data Mining, Data Cleansing, Data Extraction, Data Analysis, Database Management, Data Management
Hourly rate:
100 - 400 INR
35 minutes ago
|
|||||
Automated Loan Report Software Development
|
~7 - 35 USD | 26 minutes ago |
Client Rank
- Risky
1 open job
Registered: Jul 19, 2025
1
|
||
I am seeking a developer to create a Windows-based software solution to streamline the process of generating loan reports. The software should integrate directly with Microsoft Word to automate the editing of a six-page report format. The goal is to minimize manual effort by allowing changes to be made efficiently across different pages and locations within the document.
Key Requirements:
- Develop a Windows application with a Graphical User Interface (GUI).
- Direct integration with Microsoft Word for seamless document editing.
- Automate the entry of land rates from a PDF and integrate data from a government register website.
- Provide options for report types: free, already mortgaged, and complex.
- Enable single-entry for names and addresses to update throughout the document.
- Ensure the entire process can be completed within a few minutes to enhance efficiency. Everything should happen, most of the time, with a single click.
Ideal Skills and Experience:
- Proficiency in Windows application development.
- Experience with Microsoft Word integration and automation.
- Familiarity with PDF data extraction and web scraping for government data.
- Strong understanding of GUI design for user-friendly interfaces.
- Ability to develop efficient, time-saving software solutions.
I am looking for a solution that significantly reduces the time and effort required to produce loan reports. If you have the expertise to develop this software, I would love to hear from you.
Skills: PHP, .NET, Web Scraping, Software Architecture, Software Development, Automation, Microsoft Word
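The "single-entry updates everywhere" requirement is essentially template substitution across all six pages. Actual Word integration would go through a library such as python-docx or COM automation (assumed, not shown); the placeholder names below are hypothetical:

```python
def fill_report(pages: list[str], fields: dict[str, str]) -> list[str]:
    """Substitute every {{placeholder}} on every page so a name or
    address entered once propagates through the whole report."""
    out = []
    for page in pages:
        for key, value in fields.items():
            page = page.replace("{{%s}}" % key, value)
        out.append(page)
    return out
```

With the six-page template expressed as placeholder-bearing text, one form submission in the GUI fills the entire document in a single pass.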
Fixed budget:
600 - 3,000 INR
26 minutes ago
|
|||||
Expert Web Scraper Needed for Business Directory Data Extraction
|
30 USD | 25 minutes ago |
Client Rank
- Excellent
$115 290 total spent
80 hires, 5 active
142 jobs posted
56% hire rate,
2 open job
16.42 /hr avg hourly rate paid
5 944 hours paid
5.00
of 48 reviews
Industry: Tech & IT
Company size: 100
Registered: Apr 16, 2007
Ballito
1:01 PM
5
|
||
We are seeking an expert web scraper to extract data from yep.co.za, a prominent business directory. The ideal candidate will efficiently gather relevant business information and present it in a well-organized Excel spreadsheet. Attention to detail and accuracy are crucial for this project. If you have experience in web scraping and can deliver high-quality, formatted data quickly, we would love to hear from you. We require every single business listed in their directory, including an image URL if the listing has an image, so that we can import it into our own business directory.
Fixed budget:
30 USD
25 minutes ago
|
|||||
Web Scraping Specialist Needed for Data Extraction
|
not specified | 23 minutes ago |
Client Rank
- Excellent
$41 742 total spent
103 hires, 11 active
139 jobs posted
74% hire rate,
2 open job
8.23 /hr avg hourly rate paid
3 200 hours paid
4.98
of 87 reviews
Registered: Aug 10, 2021
London
1:01 PM
5
|
||
Looking for a web scraper to get necessary data from website.
We have used Octoparse to get some data, but we hit a limitation because the site only shows 1,000 values when in fact there are 5,000-6,000 available. We have used filters such as price to narrow the results, but we need someone to use the filters to get the full 6,000 results. This will involve scraping multiple times with different filters and removing any duplicates. You will be sent the link to the site that needs scraping and the data that we need. We look forward to hearing from you.
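The final merge-and-dedupe step this job describes — combining several filtered scrape runs into one result set — can be sketched as below. The unique key field (`"id"` here) is an assumption; it would be whatever stable identifier the site exposes per record:

```python
def merge_passes(passes: list[list[dict]], key: str = "id") -> list[dict]:
    """Union the rows from several filtered scrape runs, keeping the
    first copy of each record as identified by a unique field."""
    seen, merged = set(), []
    for rows in passes:
        for row in rows:
            if row[key] not in seen:
                seen.add(row[key])
                merged.append(row)
    return merged
```

Comparing the merged count against the site's claimed total (here, roughly 6,000) is a quick check that the filter partitions actually covered the full dataset.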
Budget:
not specified
23 minutes ago
|
|||||
Web Scraping Specialist Needed for High-Volume Data Extraction
|
30 USD | 10 minutes ago |
Client Rank
- Excellent
$1 310 total spent
18 hires
5 jobs posted
100% hire rate,
1 open job
20.00 /hr avg hourly rate paid
6 hours paid
4.99
of 19 reviews
Registered: May 24, 2022
Bedford
1:01 PM
5
|
||
We are seeking an experienced web scraping specialist to help us extract data from a specific website at scale. The ideal candidate should have a strong understanding of web scraping techniques and tools, as well as the ability to handle large datasets efficiently. Familiarity with data cleaning and processing is a plus. If you have a proven track record in web scraping projects and can deliver accurate results quickly, we want to hear from you!
Fixed budget:
30 USD
10 minutes ago
|