Looking for MLS RETS integration with ProcessWire



Currently, I don't believe there are any free or paid modules for MLS handling (though please correct me if I'm wrong). What format is the MLS feed currently in? If it's XML, you could read the feed and then use ProcessWire's API to create the listings in a foreach loop, as in the sketch below. You could also set up a cron job (there's a module for that) to run through the feed on a schedule.
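
To make that concrete, here's a minimal sketch of the foreach approach. The feed URL, the element names, and the 'listing' template with its mls_id and price fields are all assumptions; adjust them to whatever your feed and site actually use.

<?php namespace ProcessWire;

// Minimal sketch: import an XML MLS feed as ProcessWire pages.
// Assumes a 'listing' template with title, mls_id, and price fields
// and a /listings/ parent page -- all hypothetical names.
$xml = simplexml_load_file('https://example.com/mls-feed.xml');

foreach ($xml->listing as $item) {
    $mlsId = $sanitizer->selectorValue((string) $item->mls_id);
    // Skip entries that already have a page
    if ($pages->count("template=listing, mls_id=$mlsId")) continue;

    $p = new Page();
    $p->template = 'listing';
    $p->parent   = $pages->get('/listings/');
    $p->title    = (string) $item->address;
    $p->mls_id   = (string) $item->mls_id;
    $p->price    = (float) $item->price;
    $p->save();
}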

Well, as shown in my signature, I've done it. I use a PHRETS-based importer running in a cron job. It really wasn't that hard to write, and the code is mostly reusable, but it isn't a module. I'm happy to elaborate on how I did it if you have questions. I will say I'm glad I spent some effort mapping the MLS values to proper ProcessWire fields instead of just saving a JSON dump; the cleanup makes a big difference.
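
For the mapping, the idea is just to translate each raw RETS value into a typed ProcessWire field as it's imported, resolving lookup codes along the way. A rough sketch against PHRETS 2.x, with entirely hypothetical LIST_* codes, lookup name, and field names:

// Hypothetical mapping step: $raw is one decoded RETS record, $page a listing page.
// Build the code -> label map once from the MLS's lookup metadata.
$statusLookup = [];
foreach ($rets->GetLookupValues('Property', 'ListingStatus') as $value) {
    $statusLookup[$value->getValue()] = $value->getLongValue();
}

$page->of(false);
$page->price   = (int) $raw['LIST_22'];                       // hypothetical price code
$page->status  = $statusLookup[$raw['LIST_15']] ?? 'Unknown'; // lookup code -> label
$page->mls_mod = strtotime($raw['LIST_87']);                  // modification timestamp
$page->save();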


Sure, I'd be glad to elaborate. Here's the outline of the update process:

  1. Update the MLS, processing each property type and property status individually for flexibility (see the PHRETS sketch after this list)
    1. Get the layout of the RETS parent class (in my case Property) and the RETS listing type tables (in my case A, B, C, and D) using GetClassesMetadata and GetTableMetadata
    2. Populate the list of available MLS entries with only the MLS number, internal ID, and modification timestamp:
      $connection->Search('Property', 'A', '(LIST_15=ON6KCGQ87YK),(LIST_104=Y),(LIST_12=2013-06-02+)', ['Select' => 'LIST_1,LIST_105,LIST_87', 'Limit' => 'NONE'])
      The LIST_15 part of the query filters on status. Sadly, RETS makes heavy use of lookup fields, so you have to look up the code for the value you want in the lookup table before you can write the query. Fortunately, this is only an issue for field values used in the RETS queries themselves.
    3. Delete listing pages that no longer exist on the MLS
    4. For each listing without a page, or whose modification timestamp has changed:
      1. Populate the MLS data by querying for all fields of just that one property
      2. Update all photos if the photo timestamp changed (iterate the media using a GetObject -- PHRETS makes this easy)
      3. Update all documents if the document timestamp changed (mostly the same as with photos)
  2. Precache the resized images of random listings
  3. Precache the long-lived WireCache snippets of random listings
  4. Aggressively delete unused asset files, including unneeded and obsolete image sizes. 10 GB of photos and documents is plenty. This code also forces a photo update if photos are missing.
  5. Delete expired log files
  6. Prerender the homepage and the pages linked from it with ProCache:
    curl -sS `curl -sS https://website.com/|grep -oE 'listings/([a-z]+)/([a-zA-Z0-9-]+)/|quick-search/([a-zA-Z0-9-]+)/'|sed -e 's/^/https:\/\/website.com\//'|sort|uniq` >/dev/null
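
Here's the PHRETS sketch mentioned above, covering steps 1.1, 1.2, and the photo fetch in 1.4. It's written against PHRETS 2.x; the login URL, credentials, and LIST_* codes are placeholders, and every MLS numbers its fields differently.

use PHRETS\Configuration;
use PHRETS\Session;

// Log in (placeholder URL and credentials)
$config = new Configuration();
$config->setLoginUrl('https://rets.example-mls.com/login')
       ->setUsername('user')
       ->setPassword('pass')
       ->setRetsVersion('1.7.2');
$rets = new Session($config);
$rets->Login();

// Step 1.1: layout of the parent class and one listing type table
$classes = $rets->GetClassesMetadata('Property');
$fields  = $rets->GetTableMetadata('Property', 'A');

// Step 1.2: keys and timestamps only, for change detection
$results = $rets->Search('Property', 'A',
    '(LIST_15=ON6KCGQ87YK),(LIST_104=Y),(LIST_12=2013-06-02+)',
    ['Select' => 'LIST_1,LIST_105,LIST_87', 'Limit' => 'NONE']);

foreach ($results as $record) {
    $internalId = $record->get('LIST_1');
    $modified   = $record->get('LIST_87');
    // ...compare against stored timestamps and queue full fetches here
}

// Step 1.4.2: all photos for one listing in a single GetObject call
$photos = $rets->GetObject('Property', 'Photo', $internalId, '*');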

I'm sure there's plenty I missed; I'll be doing well if this even makes sense! One thing I didn't cover above is the replay mechanism: if I change the RETS value parsing, I can rerun a complete offline update using the cached JSON data from each listing. It's a big timesaver.
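
For what it's worth, the replay could look something like this. The cache path, the LIST_105 key, and the mapListingFields() helper are hypothetical; the point is just that the raw record gets saved as JSON at import time, so the mapping can be rerun without touching the RETS server.

// Hypothetical replay: rerun the field mapping from cached JSON.
// Assumes each import saved the raw record to site/assets/cache/rets/<mls>.json.
foreach (glob($config->paths->cache . 'rets/*.json') as $file) {
    $raw  = json_decode(file_get_contents($file), true);
    $page = $pages->get("template=listing, mls_id={$raw['LIST_105']}");
    if (!$page->id) continue;

    $page->of(false);
    mapListingFields($page, $raw); // hypothetical helper: raw values -> PW fields
    $page->save();
}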


Depends on the association. With this small association, it's included in the membership, and the $10 signup fee didn't sting too much. These days the MLS providers often have richer, easier options available: for instance, FBS, the maker of Flexmls, has a JSON-based IDX API, but their proprietary API wasn't worth $479/year from our perspective.


I developed modernrealestatesf.com, which uses MLS listings.

The approach I took was to have my client sign up with simplyrets.com, which provides a clean and straightforward way of accessing the needed listings. I then wrote a script that processes the data to our needs; it runs a couple of times a day. I'm not storing any images in ProcessWire, since the images in the feed are already hosted on Amazon, which is convenient.
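
For anyone curious, reading that API is about as simple as it sounds. Here's a sketch using ProcessWire's WireHttp and SimplyRETS's public demo credentials (simplyrets/simplyrets); the field names reflect their demo payload, so verify them against your own feed.

<?php namespace ProcessWire;

// Sketch: fetch listings from the SimplyRETS REST API with HTTP basic auth.
$http = new WireHttp();
$http->setHeader('Authorization', 'Basic ' . base64_encode('simplyrets:simplyrets'));
$json = $http->get('https://api.simplyrets.com/properties?limit=50');

foreach (json_decode($json, true) as $listing) {
    $mlsId  = $listing['mlsId'];
    $price  = $listing['listPrice'];
    $photos = $listing['photos'] ?? []; // already hosted on Amazon; store URLs only
    // ...create or update the matching ProcessWire page here
}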


  • 4 years later...

I could give you a hand with how the script works and its overall execution, but I can't provide the script itself. It's honestly nothing complex: it just grabs the feed and does the processing needed to get the data into the site as pages (or deletes old listing pages based on other criteria). Given your experience with ProcessWire, it's nothing out of the ordinary.
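
The delete side of such a script can be as simple as comparing the feed against the existing pages, along these lines (the template and selector field names are assumptions):

<?php namespace ProcessWire;

// Sketch: remove listing pages whose MLS IDs are no longer in the feed.
// $activeIds would be collected while processing the feed itself.
$activeIds = ['12345', '67890']; // placeholder values
$stale = $pages->find('template=listing, mls_id!=' . implode('|', $activeIds));

foreach ($stale as $p) {
    $p->trash(); // or $p->delete() if you don't want a safety net
}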

