Excel Sheet Column Separation
Budget: not specified | Posted: 1 day ago
Client Rank: Excellent | $19,210 total spent, 31 hires, 44 jobs posted, 70% hire rate, open job | 4.81 from 15 reviews
Hello, I have an Excel document with 413 contacts in it. Currently the columns are:
- Full Name
- Full Address
Each contact needs to be separated into columns like this:
- FirstName
- LastName
- CompanyName (or Firm Name)
- Address 1
- Address 2
- City
- State
- Zip
Please let me know your rate and the time frame to complete. Thanks!
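For reference, a minimal pandas sketch of how such a split might be automated, assuming hypothetical column names ("Full Name", "Full Address") and a simple "street, [suite,] city, ST ZIP" address format; real data would need more careful parsing and a manual review pass.

```python
import pandas as pd

df = pd.read_excel("contacts.xlsx")  # hypothetical input file

# Split the name on the first space: first token -> FirstName, remainder -> LastName.
names = df["Full Name"].str.strip().str.split(" ", n=1, expand=True)
df["FirstName"] = names[0]
df["LastName"] = names[1].fillna("") if names.shape[1] > 1 else ""

def split_address(addr: str) -> pd.Series:
    """Very rough split of 'street, [suite,] city, ST ZIP' into separate fields."""
    pieces = [p.strip() for p in str(addr).split(",")]
    state_zip = pieces[-1].split() if pieces else []
    return pd.Series({
        "Address 1": pieces[0] if pieces else "",
        "Address 2": pieces[1] if len(pieces) > 3 else "",
        "City": pieces[-2] if len(pieces) > 1 else "",
        "State": state_zip[0] if len(state_zip) > 0 else "",
        "Zip": state_zip[1] if len(state_zip) > 1 else "",
    })

df = df.join(df["Full Address"].apply(split_address))
df.to_excel("contacts_split.xlsx", index=False)
```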
Skills: Copy & Paste, Data Cleaning, Error Detection, Word Processing, List Building, Data Entry, Microsoft Excel
---

Digitalize bits of Cambridge University calendar
Fixed budget: 25 USD | Posted: 1 day ago
Client Rank: Excellent | $16,178 total spent, 93 hires, 92 jobs posted, 100% hire rate, open job | 5.00 from 58 reviews
The goal of this project is to build a list of fellows from Cambridge in a particular year using data from the Cambridge University calendar.
For each college, find the section that starts with "the present society" and locate the individuals who have the title "fellow", then enter each person on a separate row, together with the title (junior fellow, senior fellow, divinity fellow, lay fellow, ...) and the college. Sample output is in the attached PDF. To give a sense of volume, I expect 300-400 rows. Similar jobs are available through additional milestones. The document to be digitized is Google_1800.pdf; the other two are provided for reference only.
Skills: Accuracy Verification, Microsoft Excel, Copy & Paste, Data Cleaning, Data Entry
---

Database Merging Specialist Needed for Postgres Integration
Hourly rate: 20 - 25 USD/hr | Posted: 1 day ago
Client Rank: Excellent | $11,173 total spent, 7 hires, 2 jobs posted, 100% hire rate, open job | 5.00 from 1 review
We're seeking a skilled Database Merging Specialist to help us consolidate data from multiple sources into our main PostgreSQL database. In this role you will:
- Take the following data sources: three JSON-file sources (each with its own schema and naming conventions) and one relational SQL table
- Update and create records in our target Postgres database as defined
- Apply value-level transformations
- Ensure existing records are updated correctly and new records are created
- Validate the results
- Develop data-cleaning scripts for existing data

What we're looking for:
- Proven experience with SQL (especially Postgres), JSON parsing, and ETL workflows
- Strong attention to detail and comfort with both schema-level and value-level data transformations
- Ability to write clean, maintainable scripts (Python, Node.js, or similar)

If you've successfully tackled projects involving multiple disparate data sources—especially combining JSON files and traditional SQL tables—and you know how to not only map fields but also reconcile and transform values, we'd love to hear from you! Please share examples of similar work and any preferred tooling or approaches.
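For reference, a minimal sketch of the kind of upsert flow described above, assuming a hypothetical contacts table keyed on email and a hypothetical field mapping for one JSON source; the real schemas, keys, and transformations would come from the project's specs.

```python
import json
import psycopg2
from psycopg2.extras import execute_values

# Hypothetical mapping from one JSON source's field names to target columns.
FIELD_MAP = {"fullName": "full_name", "emailAddr": "email", "org": "company"}

def normalize(record: dict) -> dict:
    """Apply value-level transformations (trim whitespace, lowercase emails)."""
    row = {target: record.get(src) for src, target in FIELD_MAP.items()}
    if row.get("email"):
        row["email"] = row["email"].strip().lower()
    return row

with open("source_a.json") as f:            # one of the three JSON sources (hypothetical name)
    rows = [normalize(r) for r in json.load(f)]

conn = psycopg2.connect("dbname=main user=etl")   # hypothetical DSN
with conn, conn.cursor() as cur:
    # Update existing records and create new ones in a single statement.
    execute_values(
        cur,
        """
        INSERT INTO contacts (full_name, email, company)
        VALUES %s
        ON CONFLICT (email) DO UPDATE
            SET full_name = EXCLUDED.full_name,
                company   = EXCLUDED.company
        """,
        [(r["full_name"], r["email"], r["company"]) for r in rows],
    )
conn.close()
```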
Skills: JavaScript, Data Migration, PostgreSQL, SQL, Python, PostgreSQL Programming, Database, ETL, MySQL
---

Lead scraper - build a list of hair transplant clinics (ongoing work available)
Hourly rate: 5 - 15 USD/hr | Posted: 1 day ago
Client Rank: Medium | $181 total spent, 4 hires, 7 jobs posted, 57% hire rate, open job | 5.00 from 3 reviews
I'm looking for an experienced lead scraper to help me build highly qualified lead lists of hair transplant clinics that match the following criteria:
- The clinic offers hair transplant procedures
- The clinic has a live website with a functioning landing page
- The clinic is currently running paid ads to their website (Meta, TikTok, YouTube, or Google Ads)
- The clinic is based in or markets to English-speaking countries (e.g., UK, Ireland, US, Canada, Australia, UAE, Turkey targeting English speakers)
- The clinic has a Facebook Page or Instagram profile with recent activity (within 90 days)

This is NOT a generic scraping job — I need high-quality, hand-picked leads. I need you to gather the following information for each lead:
- Clinic name
- Website URL
- Email address of decision-maker (not a generic "contact-us" email)
- Country
- Social links
- Paid ads platform (i.e., Meta, TikTok, YouTube, or Google Ads)

Each email address must be verified and not scraped from a directory. If you're good, this will turn into ongoing work across multiple niches and offers. I'm looking for someone I can rely on long-term to deliver high-quality, accurate leads. To apply, please share the estimated cost per 1,000 qualified leads.
Skills: Data Cleaning, Data Scraping, Lead Generation, List Building, Data Mining, Prospect List
---

Ecommerce Data Entry & Quality Assurance Specialist
Budget: not specified | Posted: 1 day ago
Client Rank: Excellent | $218,161 total spent, 188 hires, 320 jobs posted, 59% hire rate, open job | 4.82 from 132 reviews
We are seeking a detail-oriented Ecommerce Data Entry & Quality Assurance Specialist to join our team. The ideal candidate will have hands-on experience managing product catalogs, including variants, images, descriptions, and titles, within ecommerce platforms. This role involves working closely with senior team members to ensure that product data is accurately formatted and meets the specific requirements of various shopping marketplaces.
Responsibilities:
• Manage and update product catalogs, including variants, images, and descriptions, for ecommerce platforms.
• Perform data entry and ensure accuracy and completeness of product information.
• Format and adjust product titles to comply with marketplace-specific guidelines (e.g., Shopify, Amazon).
• Merge product variants and ensure proper categorization and organization.
• Use QA tools to verify that data is properly formatted and ready for distribution to marketplaces.
• Perform bulk edits using CSV files to streamline product updates.
• Ensure high accuracy and quality in a fast-paced environment.
• Collaborate with the senior team to implement data quality improvements.
• Identify and troubleshoot data discrepancies or errors.

Requirements:
• Proven experience managing product catalogs on ecommerce platforms like Shopify, Amazon, or similar.
• Familiarity with marketplace guidelines and formatting requirements.
• Proficiency in bulk editing using CSV files (experience with Excel or Google Sheets preferred).
• Strong attention to detail and commitment to accuracy.
• Ability to work in a fast-paced environment and meet deadlines.
• Self-starter with excellent problem-solving skills and a proactive mindset.
• Experience with ecommerce tools and QA processes is a plus.

Preferred Skills:
• Knowledge of additional marketplaces and ecommerce platforms.
• Familiarity with bulk product upload tools or software.
• Experience working with cross-functional teams.
Skills: Communications, Online Research, Google Docs, Data Cleaning, Document Conversion, Data Entry, Accuracy Verification
---

Advanced Excel Data Entry / Extraction
Hourly rate: 5 - 15 USD/hr | Posted: 1 day ago
Client Rank: Medium
Please view attachment for full scope of work:
Transfer the correct pricing information into the corresponding spreadsheet titled "CATEGORY - List" by pulling price details from the individual subcategory files located within each category folder.
Skills: Data Cleaning, Microsoft Excel, Data Entry, Spreadsheet Software, Accuracy Verification, Data Scraping, Data Extraction
---

Social media, brand management, strategy
Hourly rate: 15 USD/hr | Posted: 1 day ago
Client Rank: Medium | $161 total spent, 4 hires, 6 jobs posted, 67% hire rate, open job | 5.00 from 2 reviews
Social media management, outreach, content creation, emails, engagement, LinkedIn, Shopify support, and other brand-related tasks.
Skills: Data Analysis, SurveyMonkey, Stata, Survey Design, Data Cleaning, Microsoft Excel, Research Papers, Economic Analysis, Quantitative Analysis, Marketing, Digital Marketing, PPC Campaign Setup & Management, Influencer Marketing, Social Media Management, Ecommerce, Data Collection
---

Data Wrangler (Excel Expert) - Fixed
Fixed budget: 250 USD | Posted: 1 day ago
Client Rank: Medium | $737 total spent, 3 hires, 2 jobs posted, 100% hire rate, open job
See attached JD
We're looking for a highly skilled Data Wrangler with expert-level proficiency in Microsoft Excel, including Pivot Tables, formulas, and handling complex spreadsheets. You'll be working with large datasets, correlating information from multiple files, and ensuring data accuracy and consistency.
Skills: Data Cleaning, Data Segmentation, Microsoft Excel, Data Entry, Spreadsheet Software, Data Mining, Data Analysis, Accuracy Verification, Data Visualization
---

Quality Control Agent (QC-1001)
Hourly rate: 3 USD/hr | Posted: 1 day ago
Client Rank: Excellent | $9,098 total spent, 55 hires, 91 jobs posted, 60% hire rate, open job | 4.69 from 24 reviews
This contract from Performance Home Buyers is intended to fulfill the role of: Quality Control Agent.
The primary responsibility for this role is to verify Call Logs associated with our Outbound Dialing campaigns. At the end of each dialing shift, dialing agents email their End-of-Day (EOD) Submission to management. These submissions contain: (1) Leads; and (2) any Correct Phone #s that an agent was able to identify during their dialing shift. We also extract and organize Call Logs for entire Marketing Lists after they've been dialed. Call Log errors stem from both the System and the Agent.

This position is for detail-oriented individuals who are exceptional with Microsoft Excel (must have the Desktop App). We're looking for individuals who value (1) Certainty and (2) Dependability, and we're hoping to build lasting relationships that are mutually beneficial.

If you're interested in this role, please respond by:
(1) Attaching any .xlsx workbook samples that demonstrate your orientation to detail;
(2) Showcasing any previous work history associated with Call Logs;
(3) Showcasing any previous work history associated with Quality Control.
Skills: Accuracy Verification, Microsoft Office, Error Detection, Data Cleaning, Quality Control, Data Entry, Microsoft Excel
---

Data Scientist Needed | Fake News Detection Project (Python, NLP, ML)
Hourly rate: 30 - 50 USD/hr | Posted: 1 day ago
Client Rank: Risky
We are seeking a highly skilled and experienced Data Scientist to collaborate on an exciting Fake News Detection project. In today's digital age, misinformation spreads fast—and we want to fight it with machine learning.
This project involves building a robust fake news classifier using Python, leveraging tools like TfidfVectorizer and PassiveAggressiveClassifier to identify and label fake news. If you are passionate about solving real-world problems with data science, we want to hear from you!

Project Scope
- Use the News.csv dataset to train and evaluate a model that can distinguish between real and fake news.
- Implement NLP techniques using TfidfVectorizer.
- Train and evaluate a PassiveAggressiveClassifier model (or suggest alternatives if better).
- Perform data cleaning, preprocessing, and EDA.
- Generate insights and visualizations (optional but preferred).
- Write clean, modular, and well-documented Python code.
- Help deploy or test the model as needed.

Tech Requirements
- Python (strong proficiency)
- Libraries: pandas, numpy, scikit-learn, TfidfVectorizer, PassiveAggressiveClassifier
- Experience with NLP and text classification
- Strong understanding of machine learning evaluation metrics

Ideal Candidate
- 6+ years of experience in machine learning or data science
- Prior experience with fake news detection, text classification, or NLP projects
- Excellent communication skills and ability to explain technical concepts clearly
- Available to work in or overlap with U.S. time zones
- Independent, reliable, and deadline-focused
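For reference, a minimal sketch of the pipeline the posting names (TfidfVectorizer plus PassiveAggressiveClassifier), assuming News.csv has hypothetical "text" and "label" columns; the actual column names and evaluation plan would follow the dataset.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

# Assumes News.csv has a "text" column and a "label" column ("REAL"/"FAKE").
df = pd.read_csv("News.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=7
)

# TF-IDF features: drop English stop words, ignore very common terms.
vectorizer = TfidfVectorizer(stop_words="english", max_df=0.7)
tfidf_train = vectorizer.fit_transform(X_train)
tfidf_test = vectorizer.transform(X_test)

# Online linear classifier commonly used for this kind of text classification.
clf = PassiveAggressiveClassifier(max_iter=50)
clf.fit(tfidf_train, y_train)

y_pred = clf.predict(tfidf_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print(confusion_matrix(y_test, y_pred, labels=["FAKE", "REAL"]))
```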
Skills: Python, TensorFlow, Deep Learning, Natural Language Processing, Machine Learning, NumPy, pandas, PyTorch, NLTK, Bot Development
---

Medical Face Mask Classification – Data Preparation Project
Fixed budget: 50 USD | Posted: 1 day ago
Client Rank: Good | $1,561 total spent, 22 hires, 42 jobs posted, 52% hire rate, open job | 4.66 from 7 reviews
We're looking for a skilled developer to help prepare a high-quality dataset for a face mask classification project.
🔧 Your Tasks:
1. Dataset Collection & Processing
- Find and preprocess at least 3 public datasets containing faces with and without medical masks.
- Ensure proper labeling and format consistency.
2. Synthetic Data Generation
- Use open-source solutions or custom scripts to programmatically overlay face masks on unmasked faces. Use 3+ methods.
3. Web Scraping
- Write scripts to scrape additional face mask images from the internet (Google, Bing, etc.).

🖥️ Resources Provided:
- Access to a GPU-enabled server for all data processing tasks.
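As one possible take on the synthetic-overlay task (item 2), a sketch using OpenCV's bundled Haar face detector and a hypothetical transparent mask.png; landmark-based open-source tools (e.g. MaskTheFace) fit masks more precisely and could count as another of the 3+ methods.

```python
import cv2

# Haar cascade ships with OpenCV; the RGBA mask image is a hypothetical asset.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
mask_png = cv2.imread("mask.png", cv2.IMREAD_UNCHANGED)  # must be 4-channel (BGRA)

def overlay_mask(image_path: str, out_path: str) -> None:
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        # Place the mask over the lower half of the detected face box.
        mw, mh = w, h // 2
        mask = cv2.resize(mask_png, (mw, mh))
        roi = img[y + h - mh : y + h, x : x + mw]
        alpha = mask[:, :, 3:] / 255.0           # transparency channel
        roi[:] = (1 - alpha) * roi + alpha * mask[:, :, :3]
    cv2.imwrite(out_path, img)

overlay_mask("unmasked_face.jpg", "synthetic_masked_face.jpg")  # hypothetical files
```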
Skills: Data Cleaning, Data Labeling, Data Processing, Python, Data Science
---

Real Estate Data Entry
Budget: not specified | Posted: 19 hours ago
Client Rank: Medium | $139 total spent, 1 hire, 1 job posted, 100% hire rate, open job | 5.00 from 1 review
I have approximately 300 loan payoff statements for various properties. I need someone to open each statement and use data entry skills to take certain information out of each document and paste it into a spreadsheet. I expect each payoff will take 1-2 minutes which would make this project a total of 5-10 hours.
I may have more clerical work come up from time to time and will be happy to refer more work in the future if this works out well.
Skills: Data Entry, Computer Skills, Copy & Paste, Data Cleaning, List Building, Real Estate, Clerical Skills
---

Customer Data Analysis & Cleaning in Excel
Hourly rate: ~14 - 20 USD/hr (12 - 18 EUR) | Posted: 16 hours ago
Client Rank: Risky | 1 open job | Registered: 17/04/2025
I'm seeking help with cleaning and analyzing my customer data using Excel. The project involves:
- Data Cleaning: You'll be tasked with removing duplicates, correcting errors, and standardizing formats within the dataset.
- Data Analysis: Once the data is clean, you'll perform descriptive statistics to derive insights.

Ideal candidates for this project should have:
- Extensive experience with Excel, particularly in data cleaning and analysis.
- Strong attention to detail to ensure the data is accurately prepared.
- Ability to interpret data and provide clear, concise statistical insights.

Skills: Data Entry, Excel, Statistics, Statistical Analysis, SPSS Statistics
---

Excel Gantt chart formula support
Budget: not specified | Posted: 16 hours ago
Client Rank: Risky
I need help building out this Gantt chart, understanding the formulas, and changing the dates and timelines.
Skills: Microsoft Excel, Data Cleaning, Data Analytics, Data Analysis, Google Sheets, Dashboard, Data Visualization, Excel Formula, VLOOKUP, Visualization, Data Modeling, Data Interpretation, Spreadsheet Software, Interactive Data Visualization, Power Query
---

Data Entry Clerk with Microsoft Excel or Bookkeeping Skills
Hourly rate: 20 - 30 USD/hr | Posted: 16 hours ago
Client Rank: Medium | 1 job posted, open job
Only freelancers located in the U.S. may apply.
Applicants must submit a PDF version of their resume.
Company Name: Eviation Alice, Inc.
Responsibilities:
* Accurately type and input data from printed or handwritten documents into the company's cloud databases or systems.
* Review, verify, and correct data to ensure accuracy and completeness.
* Maintain confidentiality and security of all data; manage time efficiently.
Requirements:
* High school diploma or equivalent.
* Fast and accurate typing skills (minimum 30 WPM preferred).
* Basic knowledge of Microsoft Office (especially Word and Excel) or Google Workspace.
* Excellent attention to detail and organizational skills.
* Ability to work independently and manage time effectively.
Skills: Computer Skills, Company Research, Accuracy Verification, Online Research, Communications, Email Communication, Proofreading, Typing, Document Conversion, List Building, Product Listings, Quality Control, Word Processing, Batch Proof Reports, Copy & Paste, Daily Deposits, Data Cleaning, Microsoft Office, Microsoft Word, Roboflow, CRM Software, ERP Software, Medical Records Software, CVAT, Customer Service, Clerical Procedures, Data Entry, Microsoft Excel, Administrative Support, Google Docs, Error Detection, Customer Care, Bookkeeping, Light Bookkeeping, Virtual Assistance
---

Microsoft Excel Expert for Data Cleaning and Analysis
Fixed budget: 100 USD | Posted: 14 hours ago
Client Rank: Medium | 8 jobs posted, 13% hire rate, open job
We are seeking a skilled Microsoft Excel expert to assist with data cleaning, analysis, and intelligence for our existing Excel sheets. The ideal candidate should have a strong understanding of Excel functions, data manipulation techniques, and analytical skills to derive insights from raw data. If you are detail-oriented and have experience in transforming data into meaningful information, we want to hear from you!
Skills: Microsoft Excel, Data Analysis, Data Entry, Data Visualization
---

LinkedIn Lead Generation: Lawyers
Hourly rate: 3 - 20 USD/hr | Posted: 14 hours ago
Client Rank: Medium | $529 total spent, 3 hires, 1 job posted, 100% hire rate, open job | 5.00 from 2 reviews
I am looking for someone to use LinkedIn search functions to generate a list of people matching criteria I set and to then scrape the internet to find email addresses for them for direct reach out.
Skills: Data Cleaning, Legal, Lead Generation, List Building, Data Scraping, Data Mining, Prospect List, Market Research
---

Data Entry - (Excel Data Cleaning)
Fixed budget: 10 USD | Posted: 14 hours ago
Client Rank: Excellent | $6,941 total spent, 333 hires, 318 jobs posted, 100% hire rate, open job | 4.99 from 295 reviews
We are looking for a detail-oriented Data Entry Specialist to clean, format, and organize existing data in Excel sheets. Your main tasks will include removing duplicates, correcting errors, standardizing formats, and ensuring data accuracy. You should have strong Excel skills, including knowledge of data cleaning tools (Remove Duplicates, TRIM, Text to Columns, etc.).
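The posting names Excel's built-in tools (Remove Duplicates, TRIM, Text to Columns); for reference, a rough pandas equivalent of the same cleanup, with the file name and column names purely hypothetical.

```python
import pandas as pd

df = pd.read_excel("raw_data.xlsx")  # hypothetical input workbook

# TRIM equivalent: strip stray whitespace from text columns (assumes they hold strings).
text_cols = df.select_dtypes(include="object").columns
df[text_cols] = df[text_cols].apply(lambda s: s.str.strip())

# Remove Duplicates equivalent.
df = df.drop_duplicates()

# Text to Columns equivalent: split a hypothetical "Full Name" column on the first space.
if "Full Name" in df.columns:
    df[["First Name", "Last Name"]] = df["Full Name"].str.split(" ", n=1, expand=True)

# Standardize formats, e.g. a hypothetical "Email" column to lowercase.
if "Email" in df.columns:
    df["Email"] = df["Email"].str.lower()

df.to_excel("cleaned_data.xlsx", index=False)
```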
Skills: Microsoft Excel, Data Entry, Accuracy Verification, Spreadsheet Software
---

Data Entry and List Building Specialist for Self Storage Leads in Canada
Fixed budget: 5 USD | Posted: 13 hours ago
Client Rank: Excellent | $978 total spent, 91 hires, 81 jobs posted, 100% hire rate, open job | 4.99 from 87 reviews
We are seeking a detail-oriented Data Entry and List Building Specialist to gather and organize leads for self storage facilities across Canada. The ideal candidate will have experience in data collection, strong attention to detail, and the ability to work independently. Your primary responsibility will be to compile a comprehensive list of self storage locations, including their contact information and relevant details. If you are proficient in data entry and have a knack for research, we would love to hear from you!
Skills: Accuracy Verification, Company Research, Copy & Paste, Data Cleaning, Data Entry, Lead Generation, Data Scraping, Market Research, Data Mining, Microsoft Excel, List Building, Prospect List
---

Data Entry - (Excel Data Cleaning)
Fixed budget: 10 USD | Posted: 12 hours ago
Client Rank: Excellent | $13,873 total spent, 717 hires, 529 jobs posted, 100% hire rate, open job | 4.99 from 672 reviews
We are looking for a detail-oriented Data Entry Specialist to clean, format, and organize existing data in Excel sheets. Your main tasks will include removing duplicates, correcting errors, standardizing formats, and ensuring data accuracy. You should have strong Excel skills, including knowledge of data cleaning tools (Remove Duplicates, TRIM, Text to Columns, etc.).
Skills: Microsoft Excel, Data Entry, Accuracy Verification, Spreadsheet Software, Database, Google Docs, Sales Lead Lists, Data Mining, Error Detection
---

Leads for Accounting practice around Dallas, TX area
Budget: not specified | Posted: 12 hours ago
Client Rank: Risky
Here is my initial thinking, but I'm open to your recommendations based on your prior work with this kind of task.
Must have:
- Tax and accounting firm within 150 miles of Plano, TX 75024
- Revenue between $500,000 and $1,100,000
- 30% - 50% of revenue from bookkeeping/payroll
- Must prepare tax returns
Nice to have:
- Owner is 60 years old or approaching retirement age
- Has multiple employees, ideally another CPA/EA on staff
I look forward to hearing from you on our next steps. This is my first time using Upwork, so please let me know how to proceed and what to expect. Boris
Skills: Lead Generation, Market Research, Startup Company, Campaign Management, Data Entry, Business Development, Contact List, List Building, Company Research, Email Marketing, LinkedIn Development, Data Cleaning
---

List Research / List Building / Data Entry
Hourly rate: 3 - 7 USD/hr | Posted: 12 hours ago
Client Rank: Excellent | $35,889 total spent, 5 hires, 4 jobs posted, 100% hire rate, open job | 5.00 from 1 review
Help to clean, research, and build a prospecting outreach list.
We already have the basic contact data. These contacts likely have a business or are associated with a dental practice. Your job will be to research them and find them or their business online so we can use that data in our outbound prospecting. This is our first list, and we are still learning about this type of marketing and fine-tuning our process. This is a short project, but there is potential for future projects if we enjoy working with you.
Skills: List Building, Data Cleaning, Data Entry, Data Mining, Microsoft Excel, Prospect Research, Contact List
---

Business Data Analytics Python Assignment
Fixed budget: 20 USD | Posted: 12 hours ago
Client Rank: Good | $617 total spent, 10 hires, 18 jobs posted, 56% hire rate, open job | 5.00 from 6 reviews
This is a university assignment. You have to follow the instructions given at the start. To do the assignment, open the Colab link: (link removed)
and upload the CSV file that is attached; it has the data that is to be used in the assignment. Then start writing code where required and answer in text form where that is required; in some questions both will be required. Do not change the template.
1. Do not modify the notebook structure.
2. Do not combine multiple answers in a single cell. Write each answer in its designated section.
3. Do not use AI; if that is detected, the work will not be accepted and payment will not be made, along with a bad review.
4. In some of the questions part of the code is already written. You don't have to remove it; keep writing after it. For example, code related to importing libraries is already written, so you don't have to write that again and can simply continue after it.
5. Do not write complex code; keep it easy and simple (beginner/intermediate level).
Skills: Data Analytics, Human Resources Analytics, Python, Data Analysis, Data Visualization, Microsoft Excel, Data Cleaning, Statistics, Data Science
---

Python Web Scraping Expert Needed to Extract Data from HiBid Texas Company Search
Fixed budget: 5 USD | Posted: 11 hours ago
Client Rank: Excellent | $5,195 total spent, 24 hires, 22 jobs posted, 100% hire rate, open job | 4.95 from 16 reviews
Hi there!
I'm looking to hire an experienced Python developer who specializes in web scraping to extract structured data from the following website: https://hibid.com/texas/companysearch

Scope of Work: I need to extract the following details from the search results for all listed companies:
- Company Name
- Auctioneer Name
- License Number
- Phone Number (if available)
- Email Address (if available)
- Location (City, State)
- Website URL (if listed)
The data should be scraped for all entries available through the search or pagination system.

Deliverables:
- Clean Excel/CSV file containing the extracted data
- Python script used for scraping (well-commented)
- Short README with instructions to run the script

Important Notes:
- Handle pagination (if applicable)
- Avoid IP blocking or throttling (use appropriate delays or rotating user agents)
- Solution must comply with the website's Terms of Service
- Script should be modular so it can be reused or modified later

Bonus Points If You:
- Have experience with Selenium or BeautifulSoup
- Can deliver within 2-3 days
- Provide data validation/cleaning

🛠 Preferred Skills: Python, Web Scraping (BeautifulSoup, Selenium, or Scrapy), Pandas, Data Cleaning, experience scraping auction or directory-style websites

Please include a short summary of your relevant experience. Looking forward to working with a skilled professional on this task! Thanks!
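A rough skeleton of how the pagination loop might look with Selenium, since the listings appear to load dynamically; every CSS selector and the page parameter below are placeholders to be replaced after inspecting the rendered page, and rate limiting and Terms-of-Service compliance remain the bidder's responsibility.

```python
import csv
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://hibid.com/texas/companysearch"

driver = webdriver.Chrome()
rows = []
page = 1
while True:
    driver.get(f"{BASE_URL}?apage={page}")       # hypothetical pagination parameter
    time.sleep(3)                                 # polite delay between page loads
    cards = driver.find_elements(By.CSS_SELECTOR, ".company-card")  # placeholder selector
    if not cards:
        break
    for card in cards:
        rows.append({
            "company": card.find_element(By.CSS_SELECTOR, ".company-name").text,
            "auctioneer": card.find_element(By.CSS_SELECTOR, ".auctioneer").text,
            "license": card.find_element(By.CSS_SELECTOR, ".license").text,
        })
    page += 1
driver.quit()

with open("hibid_texas_companies.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["company", "auctioneer", "license"])
    writer.writeheader()
    writer.writerows(rows)
```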
Skills: Beautiful Soup, Selenium, Automation, Python, Web Crawling, Data Mining, Web Scraping
---

Advanced Web Scraping Script for Dynamic Website (Selenium + Python + Excel + Error Handling)
Fixed budget: 455 USD | Posted: 10 hours ago
Client Rank: Risky
📌 Project Overview:
We are looking for an experienced Python developer to build a modular, scalable, and robust web scraping script to extract structured data from a dynamic automotive website. The site relies heavily on dropdown interactions and dynamically loaded content (JavaScript-based). The goal is to extract vehicle configuration data and corresponding tyre size options and output them in structured Excel and JSON formats.

The script should include:
• Advanced Selenium logic with wait handling and retry strategies
• Tyre data extraction using BeautifulSoup from rendered HTML
• Full logging and error tracking
• Retry mechanism for failed URLs
• Modular and maintainable code structure

📦 Deliverables:
1. Fully functional web scraping script
   - Built with Python 3, Selenium WebDriver, and BeautifulSoup
   - Interacts with dropdowns (manufacturer, model, type, year, engine)
   - Generates dynamic combinations and scrapes corresponding tyre data
2. Data outputs
   - All combinations stored in combinations.json
   - Clean tyre size data saved in per-manufacturer Excel files (e.g. Toyota.xlsx)
   - Separate JSON file with failed URLs and retry results (errors.json, retry_results.json)
3. Documentation
   - README.md with setup instructions
   - List of extracted fields and dropdown dependencies
   - Troubleshooting and retry instructions
4. Logs and metrics
   - Log file with timestamps for scraping progress and errors (log_YYYYMMDD.txt)
   - Optional performance.json with time-per-combination metrics

⚙️ Script Requirements:
✅ Core Functionality:
• Python 3 with Selenium
• Dynamic dropdown interaction with wait/retry logic (WebDriverWait, exception handling)
• HTML parsing via BeautifulSoup
• URL generation for each combination
• Extraction of front/rear tyre sizes
• Detection of identical or mixed tyre types

📊 Output:
• Excel output by manufacturer with structured columns:
   - Manufacturer, Model, Type, Year, Engine
   - Front Tyres 1–10, Rear Tyres 1–10
• JSON files: combinations.json, errors.json, retry_results.json

🧪 Error Handling:
• Robust error capture for common Selenium issues (Timeouts, StaleElementReference)
• Retry logic for failed links
• Separate error tracker with retry support
• Logging for all failed attempts

📈 Performance and Resilience:
• Optional human-like delay between actions (randomized)
• Time tracking for each major operation (scraping, parsing, saving)
• Graceful handling of broken DOM states and failed loads
• Script can resume from last completed manufacturer (output saved per unit)

🔧 Configuration:
• External configuration support (JSON or .env file):
   - List of manufacturers to scrape
   - Delay settings, max retries
   - Output paths

✅ Preferred Developer Skills:
• Proven experience with Selenium for dynamic websites
• Strong understanding of web page rendering and async content
• Good Python structure and code modularity (PEP8, reusable functions)
• Experience with data cleaning and export to Excel/CSV/JSON
• Ability to write robust, debuggable scripts with retry logic and logging

⏱ Budget:
• MAX budget: $455 (can vary based on experience and depth of implementation)

📝 To Apply:
Please include:
1. Examples of dynamic web scraping projects you've completed (preferably Selenium-based)
2. Description of how you handle:
   - Dropdown interactions
   - Retry strategies
   - Tyre or similar structured data parsing
3. Suggested timeline for delivering a working version and full documentation
4. Optional: GitHub or portfolio link
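A minimal sketch of the dropdown-plus-retry pattern the spec asks for (WebDriverWait with StaleElementReference/Timeout handling); the URL and element IDs are hypothetical stand-ins for the real site.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import Select, WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import StaleElementReferenceException, TimeoutException

def select_option(driver, dropdown_id: str, value: str, retries: int = 3) -> None:
    """Select a dropdown value, retrying on stale elements and timeouts."""
    for attempt in range(1, retries + 1):
        try:
            element = WebDriverWait(driver, 10).until(
                EC.element_to_be_clickable((By.ID, dropdown_id))
            )
            Select(element).select_by_visible_text(value)
            return
        except (StaleElementReferenceException, TimeoutException) as exc:
            if attempt == retries:
                raise
            print(f"Retry {attempt} for {dropdown_id}: {exc.__class__.__name__}")

driver = webdriver.Chrome()
driver.get("https://example-tyre-site.test")   # placeholder URL
# Hypothetical dropdown IDs; the cascade order mirrors the spec.
for dropdown, choice in [("manufacturer", "Toyota"), ("model", "Corolla"),
                         ("type", "Hatchback"), ("year", "2020"), ("engine", "1.8")]:
    select_option(driver, dropdown, choice)
```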
Skills: Web Scraping, Beautiful Soup, Data Extraction, Python, pandas, Selenium WebDriver, JSON
---

Real Estate Cold Calling Specialists
Budget: not specified | Posted: 10 hours ago
Client Rank: Excellent | $66,784 total spent, 54 hires, 22 jobs posted, 100% hire rate, open job | 4.52 from 33 reviews
Real Estate Cold Caller
Type: Full-Time/Part-Time/Contract
Salary: [$5-$8/hour]
Company: Dynamic real estate firm specializing in residential properties.

Job Summary: We seek a motivated Real Estate Cold Caller to generate relationships through outbound calls to potential clients, build rapport, qualify prospects, and schedule appointments for the company.

Responsibilities:
- Make high-volume outbound calls using provided scripts and lead lists.
- Qualify leads and schedule follow-ups.
- Update CRM with call details and lead statuses.
- Follow up with prospects to nurture relationships.
- Stay informed on local real estate trends.

Qualifications:
- Experience in cold calling or telemarketing can be a plus.
- Self-motivated, results-driven, and organized.
- Familiarity with CRM and dialer systems preferred.
- Reliable internet and quiet workspace.

Skills: Active listening, time management, resilience, communication
Skills: Data Entry, Telemarketing, Email Marketing, Data Cleaning, Credit Repair, Sales Call, Appointment Scheduling, Outbound Call, Real Estate Cold Calling, Cold Calling, B2C Marketing, B2B Marketing, Telemarketing Scriptwriting
---

Find Business Emails and Phone Numbers for List of Prospects
Fixed budget: 75 USD | Posted: 9 hours ago
Client Rank: Good | $3,965 total spent, 15 hires, 24 jobs posted, 63% hire rate, open job | 3.65 from 4 reviews
Hello!
I'm looking for help with finding business emails and phone numbers for a list of prospects. I would also like help sanitizing this list to delete firms such as KPMG, Deloitte, PwC, Price Water House Coopers, EY, Ernst & Young, H&R Block, HR Block, Block Advisors, BDO, and JP Morgan. Brandon M.
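For reference, a rough sketch of the sanitizing step in pandas, assuming a hypothetical prospects.csv with a "Company" column; the exclusion list comes straight from the names above.

```python
import re
import pandas as pd

EXCLUDE = [
    "KPMG", "Deloitte", "PwC", "Price Water House Coopers", "EY", "Ernst & Young",
    "H&R Block", "HR Block", "Block Advisors", "BDO", "JP Morgan",
]

df = pd.read_csv("prospects.csv")  # hypothetical input with a "Company" column

# Case-insensitive substring match against any excluded firm name.
# Note: very short names like "EY" may need word-boundary handling to avoid false matches.
pattern = "|".join(re.escape(name) for name in EXCLUDE)
keep = ~df["Company"].str.contains(pattern, case=False, na=False)
df[keep].to_csv("prospects_sanitized.csv", index=False)
```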
Skills: Data Mining, Lead Generation, List Building, HubSpot, Customer Relationship Management, Data Migration, Email Marketing, Data Cleaning, Prospect List, Social Media Lead Generation, Market Research, LinkedIn Lead Generation, Search Engine
---

Data Science Project Seeking Expertise
Fixed budget: ~18 - 146 USD (1,500 - 12,500 INR) | Posted: 19 hours ago
Client Rank: Excellent | $25,232 total spent, 30 hires, 1 active, 1 open job | 5.00 from 2 reviews | Registered: 02/04/2023
I am looking for a seasoned data scientist to assist with a project. The specifics of the project are not yet defined, as I am open to suggestions based on your expertise and experience.
Ideal candidates would have skills in:
- Predictive modeling
- Data cleaning and preprocessing
- Data visualization

Please provide your ideas on how we could leverage the data to either increase business efficiency, gain insights, or develop a new product feature.

Skills: Excel, SPSS Statistics, Data Science
---

AI-Powered Business Acquisition Analysis Tool Development
Fixed budget: 1,000 USD | Posted: 4 hours ago
Client Rank: Excellent | $30,322 total spent, 41 hires, 67 jobs posted, 61% hire rate, open job | 4.91 from 32 reviews
We are seeking a skilled and experienced developer to build an AI-powered tool that can analyze business acquisition deals. This tool will streamline the due diligence process by automating the extraction and analysis of data from various sources. The ideal candidate will be able to propose and implement a robust solution, potentially using workflow automation platforms like n8n or suggesting suitable alternatives. The successful deliverable should be a functional tool that can be packaged and sold to other businesses looking to improve their acquisition analysis.
**Project Objective:** The primary objective is to create a modular tool that can efficiently process data related to potential business acquisitions and provide actionable insights. This tool should help users quickly assess the financial health, customer base, legal risks, and market position of a target company. The ultimate goal is to develop a product that can be offered as a service or software to other businesses to aid in their acquisition decision-making.

**Key Requirements and Functionalities:** The tool should be developed in distinct, manageable sections (modules) to facilitate development and testing. We anticipate the following core modules:
* **Data Ingestion and Preprocessing:**
  * Ability to ingest data from various file types (PDF, CSV, XLSX, DOCX, TXT, images).
  * OCR capabilities to extract text from scanned documents.
  * NLP techniques for extracting key information from textual data (contracts, etc.).
  * Data cleaning and validation.
  * Data structuring for subsequent analysis.
* **Financial Analysis:**
  * Parsing and standardization of financial statements.
  * Time series analysis and financial forecasting.
  * Financial ratio calculation.
  * Valuation modeling (DCF, Comparable Company Analysis, etc.).
  * Financial risk assessment.
* **Customer Analysis:**
  * Customer data processing and structuring.
  * Customer segmentation.
  * Churn prediction.
  * Key customer identification.
* **Legal and Contractual Analysis:**
  * Contract parsing and key clause extraction.
  * Due diligence checklist integration.
  * Legal risk identification.
* **Market and Industry Analysis:**
  * Market data aggregation.
  * Competitive analysis.
  * Trend identification.
* **Synthesis and Reporting:**
  * Data aggregation and weighted scoring.
  * Valuation range and recommendation.
  * User-friendly interactive dashboard.
  * Natural language explanations.
  * Report generation and export.

**Desired Deliverables:**
* A fully functional tool that implements the specified modules.
* Clear and comprehensive documentation on how to use, configure, and maintain the tool.
* Well-structured and maintainable code.
* A plan or suggestions for packaging and distributing the tool (e.g., as an n8n template, a self-contained application, or an API).
* A basic user interface or method for others to use a tool like Google Drive to upload their documents and receive the output report.

**Technology Stack:**
* While n8n is a preferred workflow automation platform, we are open to alternative suggestions if a more efficient or suitable solution is available.
* The developer should be proficient in relevant technologies such as:
  * JavaScript/Python (for scripting and AI/ML integration).
  * OCR libraries/APIs.
  * NLP libraries/APIs.
  * Data manipulation and analysis libraries.
  * Database technologies (if needed).
  * API integration.

**Payment Structure:** To ensure clear milestones and fair compensation, we propose a payment structure based on the completion of each module:
* **Payment 1:** [$200] upon successful completion and testing of the "Data Ingestion and Preprocessing" module.
* **Payment 2:** [$200] upon successful completion and testing of the "Financial Analysis" module.
* **Payment 3:** [$200] upon successful completion and testing of the "Customer Analysis" module.
* **Payment 4:** [$200] upon successful completion and testing of the "Legal and Contractual Analysis" and "Market and Industry Analysis" modules.
* **Payment 5:** [$200] upon successful completion of the "Synthesis and Reporting" module, comprehensive documentation, and delivery of the final tool.

**Proposal Requirements:**
* Please provide a detailed proposal outlining your approach to developing this tool.
* Specify your preferred technology stack (if different from n8n).
* Include your relevant experience and portfolio.
* Provide an estimated timeline for completing each module.
* State your proposed budget for each payment milestone.
* Outline your communication and project management process.

**Important Considerations for Bidders:**
* This project requires a strong understanding of data processing, automation, and potentially AI/ML.
* Effective communication and collaboration are essential.
* The ability to provide clear and concise documentation is crucial.
* The developer should be able to think critically about how to make this tool user-friendly and marketable to other businesses.

We look forward to receiving your proposals and finding a talented developer to bring this exciting project to life!
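Purely as an illustration of the first module (Data Ingestion and Preprocessing), a sketch that routes files by extension to a text extractor; the library choices (pdfplumber, pytesseract, python-docx) are suggestions, not requirements from the brief.

```python
from pathlib import Path

import pdfplumber                      # PDF text extraction
import pytesseract                     # OCR for scanned images
from docx import Document              # DOCX parsing (python-docx)
from PIL import Image

def extract_text(path: str) -> str:
    """Route a document to the right extractor based on its extension."""
    suffix = Path(path).suffix.lower()
    if suffix == ".pdf":
        with pdfplumber.open(path) as pdf:
            return "\n".join(page.extract_text() or "" for page in pdf.pages)
    if suffix == ".docx":
        return "\n".join(p.text for p in Document(path).paragraphs)
    if suffix in {".png", ".jpg", ".jpeg", ".tiff"}:
        return pytesseract.image_to_string(Image.open(path))
    if suffix == ".txt":
        return Path(path).read_text(errors="ignore")
    raise ValueError(f"Unsupported file type: {suffix}")

print(extract_text("sample_contract.pdf")[:500])   # hypothetical input file
```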
Skills: Python, Natural Language Processing, n8n, JavaScript, API Integration, Machine Learning
---

🔍 Lead Generation Specialist – Real Estate (Property Managers & Developers)
Budget: not specified | Posted: 2 hours ago
Client Rank: Excellent | $718,273 total spent, 123 hires, 90 jobs posted, 100% hire rate, open job | 4.82 from 74 reviews
Featured
We're hiring a lead generation specialist with experience in real estate, specifically targeting property managers and real estate developers.
You'll be responsible for building a pipeline of qualified leads by:
- Identifying decision-makers (e.g., CFOs, Directors of Property Management, Controllers)
- Researching and qualifying contacts within accounts we provide
- Sourcing new target accounts that match our ICP
- Using advanced search strategies (LinkedIn, Apollo, ZoomInfo, Clay, etc.) to find:
  - Mobile phone numbers
  - Work and personal emails
  - Office phone numbers
  - LinkedIn profiles
  - Any other viable contact methods
- Delivering clean, organized lead lists with complete contact profiles

We'll work together to set clear goals, and you'll be expected to monitor your own metrics and report on progress. This includes things like number of leads sourced, contact accuracy, and volume of accounts researched weekly.

About Us: We're a fast-growing tech-enabled company transforming accounting services for property managers and developers. Our clients rely on us to reduce complexity, increase accuracy, and scale their operations.

Who You Are:
- 2+ years of experience in lead generation or B2B list building
- Strong knowledge of the real estate sector, especially property management and development
- Skilled in sourcing multiple verified contact methods per lead
- Proficient with tools like LinkedIn Sales Navigator, Apollo, Clay, ZoomInfo, Skrapp, etc.
- Detail-oriented, proactive, and highly self-managed
- Comfortable setting and tracking your own performance metrics

Bonus if you have:
- Familiarity with AppFolio, Yardi, or Buildium
- Experience working with sales teams at fast-paced startups

To Apply: Please include:
- A short summary of your real estate lead generation experience
- Examples of lead lists or campaigns you've worked on
- A list of tools and methods you use to source complete contact profiles
- Any performance metrics you've tracked in past roles

This is a pilot project with potential for long-term, ongoing work if successful.
Skills: Data Cleaning, Lead Generation Analysis, Lead Generation Strategy, HubSpot, Lead Generation, Data Scraping, List Building, Prospect List
---

Data Scraper – Floor Plans & BOMs
Budget: not specified | Posted: 1 hour ago
Client Rank: Medium | $62 total spent, 1 hire, 4 jobs posted, 25% hire rate, open job | 5.00 from 1 review
Freelance Data Scraper – Floor Plans & BOMs (Phase 1: Data + Prototyping)
📍 Remote | Freelance (Phase 1) | 💰 Fixed price proposals only

The Opportunity
We're looking for a freelance data scraper to help us gather and organize datasets of floor plans and BOMs for the first phase of prototyping.

Your Role
📂 Collect a wide range of floor plans (PDF, JPG, PNG, CAD/DWG)
🧾 Collect Bills of Materials (BOMs) or detailed material schedules/takeoffs
📊 Organize and label data: project type, region, size, format, etc.
🧼 Clean and standardize files so they're usable by our AI/ML team
🔍 Track sources and provide a summary of where/how data was collected

Where You'll Find the Data
You'll need to legally scrape or collect data from publicly available sources, including:
🔹 Real estate sites (e.g. Zillow, Realtor.com, Rightmove)
🔹 Architecture directories (e.g. ArchDaily, Floorplanner, CADdetails)
🔹 Government planning portals (public applications)
🔹 Architect portfolios and educational libraries
🔹 Open-source datasets (e.g. GitHub, Kaggle)
🔹 Forums or construction websites with shared templates/resources
We are looking for diverse data: residential and commercial, different styles, and global regions.

Deliverables
✅ At least 500 unique floor plans and 100 BOMs
✅ Organized folder system and metadata spreadsheet
✅ Weekly updates on progress
✅ All data must be legally obtained and clearly sourced

Ideal Candidate
✅ Proven scraping experience (Python, BeautifulSoup, Selenium, Scrapy, etc.)
✅ Understands architectural formats and terminology (bonus if you've worked in AEC before)
✅ Detail-oriented with strong file/org skills
✅ Reliable and communicative

Please include:
- A short summary of your scraping/data collection experience
- What tools/tech you use
- Any relevant sample work or links
- Your availability and rate
Skills: Web Scraping, Python, Data Extraction, Microsoft Excel, Data Entry, File Management, CAD Software, Data Cleaning, Data Mining, Data Scraping