Job Description
We are seeking an experienced AI and data engineer to build a lot‑level real‑estate analytics platform focused on disaster‑affected lots in the United States. The goal is to give buyers richer insights than standard portals by combining Multiple Listing Service (MLS) data with the City of Los Angeles's Zone Information and Map Access System (ZIMAS) and other public records. The tool will update daily; merge sales, zoning, and property‑record data; and deliver an interactive dashboard with advanced filters, comparables, and alerts.
Core Responsibilities:
Data ingestion & integration:
- Establish an automated pipeline to pull MLS listing data—active, pending and sold—including historical sales (prior transactions with sale price and date). MLS feeds contain hundreds of fields beyond what consumer portals display.
- Merge MLS data with ZIMAS records to add zoning codes, hillside designations and fire‑zone status.
- Incorporate building‑permit and property‑record data from LADBS (or other municipal sources) to capture structures and improvements on each lot. LADBS's building records include building permits, certificates of occupancy, and code‑violation case records, and the Residential Property Report provides comprehensive property details and history. These sources will help identify existing structures, improvements, and any outstanding code violations.
- Design workflows to fetch or scrape code‑enforcement data (e.g., LA Housing Department or LADBS code‑violation dashboards) so that buyers can see if a lot has unresolved compliance issues.
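The merge step described above could be sketched as a simple join on Assessor Parcel Number (APN). This is an illustrative sketch only; the field names (`apn`, `zoning`, `hillside`, `fire_zone`) are hypothetical placeholders, not actual MLS or ZIMAS feed fields.

```python
# Sketch: attach ZIMAS parcel attributes (zoning, hillside, fire zone)
# to each MLS listing by APN. All field names are illustrative.

def merge_listings_with_zimas(listings, zimas_records):
    """Return listings enriched with ZIMAS attributes, joined on APN."""
    zimas_by_apn = {rec["apn"]: rec for rec in zimas_records}
    merged = []
    for listing in listings:
        parcel = zimas_by_apn.get(listing["apn"], {})
        merged.append({
            **listing,
            "zoning": parcel.get("zoning"),
            "hillside": parcel.get("hillside", False),
            "fire_zone": parcel.get("fire_zone"),
        })
    return merged
```

Listings with no matching parcel record simply carry empty ZIMAS fields, so a partial ZIMAS extract does not block ingestion.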
Database design:
- Create a schema that stores each lot’s identification, physical characteristics (size, slope, view), historical sale prices and dates, current listing data, price history, zoning overlays, building‑permit history, structures/improvements, and any recorded code violations.
- Implement daily updates and maintain a change log.
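A minimal sketch of one possible schema, using SQLite purely for illustration; table and column names here are assumptions, not a finished design.

```python
import sqlite3

# Illustrative lot-level schema: lot identification and characteristics,
# historical sales, and a change log for the daily update workflow.
SCHEMA = """
CREATE TABLE lots (
    apn TEXT PRIMARY KEY,        -- Assessor Parcel Number
    lot_size_sqft REAL,
    slope_category TEXT,         -- e.g. 'flat' or 'hillside'
    zoning TEXT,
    fire_zone TEXT
);
CREATE TABLE sales_history (
    id INTEGER PRIMARY KEY,
    apn TEXT REFERENCES lots(apn),
    sale_date TEXT,
    sale_price INTEGER
);
CREATE TABLE change_log (
    id INTEGER PRIMARY KEY,
    apn TEXT,
    field TEXT,
    old_value TEXT,
    new_value TEXT,
    changed_at TEXT DEFAULT CURRENT_TIMESTAMP
);
"""

def init_db(path=":memory:"):
    """Create a fresh database with the sketch schema."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

In production this would likely move to PostgreSQL with PostGIS for the geospatial columns, but the table shape carries over.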
Analytics and AI:
- Develop algorithms to classify lots by terrain (flat vs. hillside), price tier and market activity. Use AI techniques as needed to derive slope categories from topographic data or to match code‑violation records to each parcel.
- Generate comparable sales and compute metrics like price per square foot, days on market trends, and price‑reduction frequency.
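One of the metrics above, price per square foot across a set of comparables, could be computed along these lines (the `sale_price` and `building_sqft` keys are illustrative field names, not a defined record format):

```python
from statistics import median

def price_per_sqft(comps):
    """Median price per square foot across comparable sold listings.

    Comps without a recorded building size are skipped; returns None
    when no usable comps remain.
    """
    ratios = [c["sale_price"] / c["building_sqft"]
              for c in comps if c.get("building_sqft")]
    return median(ratios) if ratios else None
```

Using the median rather than the mean keeps a single outlier sale from skewing the comparable figure.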
Dashboard development:
- Build a user‑friendly web interface (Streamlit, Dash, React, or similar) with filter widgets for micro‑neighborhood, days on market, slope category, price range, lot size, pre‑fire structure size, historical‑sale criteria, presence/absence of structures or improvements, and code‑violation flags.
- Include an interactive map and comparables list, plus charts summarizing price per square foot, inventory levels, investor activity, historical price appreciation and compliance status.
- Users should be able to run complex queries such as “Alphabet Streets, DOM greater than 90 days, hillside, pre‑disaster structure larger than 3,000 sq ft, no existing code violations, sold at least once since 2018.”
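The example query above could translate to a filter predicate roughly like this; keys such as `pre_disaster_sqft` and `open_violations` are hypothetical, chosen only to mirror the example.

```python
from datetime import date

def matches_example_query(lot):
    """Predicate for the example search: Alphabet Streets, DOM > 90,
    hillside, pre-disaster structure > 3,000 sq ft, no open violations,
    sold at least once since 2018. All keys are illustrative."""
    return (
        lot["neighborhood"] == "Alphabet Streets"
        and lot["days_on_market"] > 90
        and lot["slope_category"] == "hillside"
        and lot["pre_disaster_sqft"] > 3000
        and not lot["open_violations"]
        and any(s >= date(2018, 1, 1) for s in lot["sale_dates"])
    )
```

In practice the dashboard would build such predicates dynamically from the filter widgets rather than hard-coding them.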
Alerts and reporting:
- Implement an alert system for new listings, price changes, or matches to saved searches. Provide downloadable reports (CSV/PDF) with all captured fields, including historical sales, improvements and code‑violation status.
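The new-listing and price-change alerts above could be driven by comparing consecutive daily snapshots. A minimal sketch, assuming snapshots are dicts keyed by APN with an illustrative `list_price` field:

```python
def detect_changes(old_snapshot, new_snapshot):
    """Compare yesterday's and today's listing snapshots (keyed by APN)
    and emit (event_type, apn) alert tuples for new listings and
    price changes. Field names are illustrative."""
    alerts = []
    for apn, listing in new_snapshot.items():
        prev = old_snapshot.get(apn)
        if prev is None:
            alerts.append(("new_listing", apn))
        elif listing["list_price"] != prev["list_price"]:
            alerts.append(("price_change", apn))
    return alerts
```

These events would then be matched against saved searches before notifying users.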
Required Skills and Experience:
- Experience with data ingestion from APIs or web scraping (MLS, GIS, building permits, code‑violation data).
- Proficiency in Python and data‑processing libraries; familiarity with relational and geospatial databases.
- Strong background in dashboard development (Streamlit, Dash, Plotly, Power BI, or similar).
- Knowledge of real‑estate data structures; experience working with MLS feeds or property‑record datasets is a plus.
- Bonus: experience integrating building‑permit data, code‑violation records and historical sales data from municipal sources.
Deliverables:
- A working prototype that ingests MLS, ZIMAS and property‑record data, updates daily, and displays the merged information in a functional dashboard.
- Source code and documentation for the data pipeline and database schema.
- Recommendations for scaling the system and incorporating additional datasets (e.g., county assessor files, permit histories, code‑violation dashboards).
General Tool Name:
This project could be described as a Real‑Estate Market Insight Dashboard or Lot‑Level Analytics Platform—terms that convey its purpose of aggregating sales, zoning, improvement and compliance data to deliver actionable insights beyond what standard portals provide.
Apply to this job