
Model Audits with Dynamo Part 2 - Big Data

My last post covered the development of model audit scripts with Dynamo.

Running the audit as an issued-state check is very useful, but there was a desire to capture all of the valuable data collected as part of the audit: once the Excel spreadsheet used to present and record the results is closed, all of that data is lost. Furthermore, there is additional data in the model which would be worth capturing.



For example, consider comparing the number of sheets in a model (a fairly broad measure of project complexity) against the hours worked recorded in a company time-sheet database. This would allow for more accurate costing of similar future projects. There are countless similar metrics, and with the advent of machine learning upon us, capturing vast amounts of data has never been more important: we don't yet know what uses it will have in years to come.
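As a toy illustration of the kind of cross-referencing described above, here is a minimal Python sketch. The project codes and figures are entirely invented, and in practice both datasets would come from the audit database and the time-sheet system rather than hard-coded dictionaries:

```python
# Invented sample data: sheet counts from model audits and hours
# from a time-sheet database, keyed by (hypothetical) project code.
sheets = {"P-100": 120, "P-101": 45, "P-102": 200}
hours = {"P-100": 950, "P-101": 380, "P-102": 1700}

# Hours per sheet: a crude complexity/effort metric that could
# inform fee estimates on similar future projects.
hours_per_sheet = {
    p: hours[p] / float(sheets[p]) for p in sheets if p in hours
}
```

The point is not the arithmetic, which is trivial, but that neither number is useful on its own; the value appears once both are captured in one place.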

In order to record the data it's necessary to connect Dynamo to a database, since Dynamo has no internal means of persisting results. Following a few experiments with SQLite, I opted for a MySQL database setup. The additional script needed to export the data to SQL is fairly simple; thanks to the Slingshot custom nodes, no additional Python scripts are required.
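For readers curious what the export amounts to, here is a rough Python sketch of the sort of parameterised INSERT the database nodes issue. The table name, column names, and sample values are all hypothetical, not the actual audit schema:

```python
# Hypothetical sketch of exporting one audit record to MySQL.
# Table and column names are illustrative only.

def build_insert(table, row):
    """Build a parameterised INSERT statement for one audit record."""
    columns = ", ".join(row.keys())
    placeholders = ", ".join(["%s"] * len(row))
    sql = "INSERT INTO {0} ({1}) VALUES ({2})".format(
        table, columns, placeholders)
    return sql, list(row.values())

audit_row = {
    "project": "P-1234",       # placeholder project code
    "sheet_count": 87,         # e.g. output of a sheet-count check
    "duplicate_elements": 3,
    "run_date": "2017-05-01",
}

sql, params = build_insert("model_audits", audit_row)
# Executing it would need a MySQL driver outside Dynamo, e.g.:
#   import mysql.connector
#   conn = mysql.connector.connect(host=..., user=..., password=...,
#                                  database=...)
#   conn.cursor().execute(sql, params)
```

With Slingshot none of this has to be written by hand; the nodes accept the connection string and the lists of values directly.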


The script was laid out neatly with code blocks to allow for future scaling, should it be necessary to capture more data as the audit expands.





Once the updated audit was deployed to the network, every audit run from Dynamo exported its results to the SQL database as the final step of the script.


A native SQL database is not the most user-friendly interface, so to work with and present the data I chose Microsoft Power BI. The dynamic interface is very easy to use and the results look great: you can hover over bars of data to see their properties and drill down into the data on the fly to create whole new charts. Find out more here: https://powerbi.microsoft.com/en-us/.

I won't go into too much detail on the process of creating the charts here, but Power BI is very easy to use and writing queries is incredibly simple. These are some examples of how I am analysing the data; the reports display live model status data via a web browser.
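To give a flavour of the queries behind such charts, here is one sketch. The schema is invented, and it is demonstrated against an in-memory SQLite database purely so the example is self-contained; the live reports query the MySQL database directly from Power BI:

```python
import sqlite3

# Invented audit schema with a few sample rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE model_audits ("
             "project TEXT, run_date TEXT, "
             "sheet_count INTEGER, warnings INTEGER)")
conn.executemany(
    "INSERT INTO model_audits VALUES (?, ?, ?, ?)",
    [("P-100", "2017-04-01", 110, 12),
     ("P-100", "2017-05-01", 120, 9),
     ("P-101", "2017-05-01", 45, 30)],
)

# Latest sheet and warning counts per project -- the sort of
# figure a Power BI bar chart would surface.
latest = conn.execute(
    """SELECT project, sheet_count, warnings
       FROM model_audits a
       WHERE run_date = (SELECT MAX(run_date) FROM model_audits b
                         WHERE b.project = a.project)
       ORDER BY project"""
).fetchall()
```

Because each audit run appends a dated row rather than overwriting, queries like this can show trends over time as well as the current state.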




