Yes, the upgrade completely broke my IFTTT integration. I am just finalizing my switch over to MQTT. It was a lot of work, but I also learned a lot.
Old solution:
IFTTT messages from my mc-mods logged to Google Sheets. Google Sheets then acted as the database for data visualization.
Pros: Simple and sweet.
Cons: Not very flexible within the scope of IFTTT and Google, and prone to hiccups on the 3rd-party cloud side of the solution (e.g. IFTTT would just stop receiving messages for hours, and code updates seemed to shift the column output to Google Sheets). There is also a 1,000-row limit per sheet, so visualizations require appending/unioning sheets, plus additional parsing for column discrepancies.
New Solution:
MQTT JSON data is published from the mc-mods. A Python script (using the paho and Postgres libraries) on a Raspberry Pi subscribes, parses the JSON, and runs an INSERT into a Postgres DB (also on the Pi). Postgres is now the DB for visualization.
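The subscribe-and-insert step looks roughly like this. This is only a sketch of my approach, and the topic, table, and JSON field names (`device_id`, `datetime`, etc.) are placeholders, not the actual ones I use:

```python
import json
from datetime import datetime, timezone

# Hypothetical table/column names; the real schema will differ.
INSERT_SQL = (
    "INSERT INTO readings (device_id, device_dt, received_dt, payload) "
    "VALUES (%s, %s, %s, %s)"
)

def parse_message(raw):
    """Turn one MQTT JSON payload into bind parameters for INSERT_SQL."""
    data = json.loads(raw)
    # Timestamp recorded by the Python subscriber on receipt,
    # independent of whatever the device reported.
    received = datetime.now(timezone.utc).isoformat()
    return (data["device_id"], data.get("datetime"), received,
            json.dumps(data))

# Wiring sketch (needs paho-mqtt and a Postgres driver such as psycopg2;
# not runnable standalone):
#
# def on_message(client, userdata, msg):
#     cur.execute(INSERT_SQL, parse_message(msg.payload))
#     conn.commit()
#
# client = mqtt.Client()
# client.on_message = on_message
# client.connect("localhost")
# client.subscribe("mcmods/#")
# client.loop_forever()
```

Keeping the parsing in its own function makes it easy to post-process or augment the data before it hits the DB.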
Pros: more robust. Can point to a different Postgres host any time. Python allows for post-processing of data and data augmentation. Python gives much more control and allows for pretty much anything else I can imagine to do with it.
Cons: I needed to learn Python (and the paho and Postgres libraries; lots of samples out there, but I Googled answers frequently). It is more complex, but I am in control. device.date() does not seem to be captured on boot prior to the first publish, so the first boot record has bad device date data (year 2000). There is something strange about needing to connect via the gateway first; I tried some boot-event code with limited success. I now capture three dates in Postgres: device.datetime, the Python subscriber's receive datetime, and Postgres's automatic datetime at record write.
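The three-timestamp idea can be expressed in the table definition itself, with Postgres filling in the third one automatically. Table and column names here are illustrative, not my actual schema:

```sql
CREATE TABLE readings (
    device_id   text,
    device_dt   timestamptz,              -- device.datetime from the JSON (bad on first boot)
    received_dt timestamptz,              -- stamped by the Python subscriber
    record_dt   timestamptz DEFAULT now() -- stamped by Postgres at insert
);
```

Having all three makes it easy to spot the year-2000 first-boot records, since device_dt will disagree wildly with the other two.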
Both solutions capture and log the device.id. I use a lookup table to join the ID to the location where the sensor is installed. I need to add valid-from and valid-to date fields to the lookup so I can swap out devices and still get proper historicals.
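The valid-from/valid-to join would look something like this (a sketch; lookup table and column names are assumptions on my part):

```sql
SELECT r.*, loc.location
FROM readings r
JOIN device_location loc
  ON loc.device_id = r.device_id
 AND r.record_dt >= loc.valid_from
 AND r.record_dt <  COALESCE(loc.valid_to, 'infinity'::timestamptz);
```

Leaving valid_to NULL for the current placement and COALESCE-ing it to infinity means the open-ended row always matches recent readings, while historical readings join to whichever device was in that location at the time.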
I consider my solution a beta 2. I'll post it on GitHub once I hit beta 3 or 4, if anyone is interested.
Regards,
Brad