Model Audits with the help of Dynamo

An important part of our BIM strategy is the validation and auditing of our BIM geometry models. This has traditionally been a very analogue process of running through company-standard compliance checklists and checking the model error logs for duplicate elements, view naming compliance, view template usage, imported linestyles..... yawn..... and so on. This process can easily take half a day and is therefore normally done only at key model-sharing stages. I wanted to resolve this: I believe every shared model should be audited for quality in order to minimise errors, as part of our BIM delivery QA - enter Dynamo.

Using the existing paper-based checklists as guidance, along with a wishlist of extra checks I'd like to conduct, I set about creating the equivalent checks, all beautifully automated with Dynamo. This proved far easier and more intuitive than I thought; nodes already existed for many of the checks I wanted to do: "get the views... get the names... filter the ones which don't have... etc. etc."
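As a sketch of the kind of check described above, a Python node in Dynamo could filter view names against a naming rule like this. The pattern and view names here are hypothetical placeholders, not the author's actual company standard:

```python
import re

def non_compliant_views(view_names, pattern=r"^[A-Z]{2}-\d{2}-.+"):
    """Return view names that do not match the naming convention.

    The regex is a placeholder; substitute your own company standard.
    """
    rule = re.compile(pattern)
    return [name for name in view_names if not rule.match(name)]

views = ["AR-01-Ground Floor Plan", "Copy of Level 1", "ST-02-Roof Plan"]
print(non_compliant_views(views))  # → ['Copy of Level 1']
```

In a Dynamo graph the same logic is just a chain of standard nodes (get views, get names, filter by a boolean mask), which is why the checks proved so quick to build.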

Once I had the data and results, I needed an interface with which to display them. For this I used Excel (I love Excel): once the data is in Excel, a few IF() statements are easy to write and you can quantify results easily.
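The pass/fail logic of those IF() statements is easy to mirror in code. A minimal sketch, assuming each check reduces to a single boolean (the check names are illustrative):

```python
def audit_summary(results):
    """Summarise check results into PASS/FAIL counts, mimicking
    the IF() statements on the Excel results sheet."""
    passed = sum(1 for ok in results.values() if ok)
    failed = len(results) - passed
    verdict = "PASS" if failed == 0 else "FAIL"
    return passed, failed, verdict

checks = {
    "Duplicate elements": True,
    "View naming": False,
    "Imported linestyles": True,
}
print(audit_summary(checks))  # → (2, 1, 'FAIL')
```

The equivalent spreadsheet cell would be something like `=IF(COUNTIF(B:B,FALSE)=0,"PASS","FAIL")`.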

The interface sits as a shared spreadsheet on the server, with some simple scripts which clear the data when it's closed. Anyone in our organisation can then run the audit from Dynamo simultaneously, with no user input required, and print the results when a model is issued.

The hope is that as Dynamo evolves we can automate more and more checks on our models before they are shared on the CDE.

The audit completes within a minute, a time saving of easily a couple of hours over the traditional process. This means every model that leaves our office has been audited and is required to pass a certain base level of quality control before Document Control will issue it externally.

Future development is already in place to export this data as the audits are run and store it in a SQL database. The data will then be used for analytics and to give an overview of model health throughout the organisation.
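A minimal sketch of that export step, using SQLite as a stand-in for the production database. The table and column names are hypothetical, not the author's actual schema:

```python
import sqlite3
from datetime import datetime, timezone

def store_audit(conn, model_name, passed, failed):
    """Append one audit run to the database (schema is illustrative)."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS model_audits (
               run_at TEXT, model TEXT,
               checks_passed INTEGER, checks_failed INTEGER)"""
    )
    conn.execute(
        "INSERT INTO model_audits VALUES (?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), model_name, passed, failed),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
store_audit(conn, "Project-A.rvt", 14, 2)
print(conn.execute("SELECT model, checks_failed FROM model_audits").fetchall())
# → [('Project-A.rvt', 2)]
```

Because each run is timestamped, model health can later be charted over time rather than viewed as a single snapshot.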


  1. I shared your article on theBIMhub

  2. This looks good, thank you. There are a few others doing similar things... the Carolinas Health Care System has a system-specific reporting plugin for file compliance in their model checking system, covering:
    -- Shared parameters found in model - shared parameter group, category
    -- Missing shared parameters
    -- OmniClass numbers - OmniClass, element category, type name, and result
    -- Family naming - element category, family name, and result
    -- Worksets - workset name, workset ID, and result
    -- Floor plan names - floor plan name, floor plan level, errors in compliance
    -- Wall types - wall type, type mark, fire rating, fire test #, UL URL, function, assembly code, width, errors found with compliance

    Then they check the discipline and phase of the project, along with model phases for each.
    Then, on a Content Check result page in the UI they have developed, they have an Overview, Project Information, Appendix B, Room Schedule, Archibus, Attainia, Bed License, Doors, Finishes, Furniture, and Walls.

    ..... Then at the firm that I am with, we are additionally concerned with global project file performance internally.
    From the network drive we scan nightly for:
    -- New Projects.
    -- Major Version Change
    -- Central Model Change
    -- Weekly File Size Jump
    -- Daily File Size Jump
    -- Exceed Max Size Threshold
    -- New Corrupted Files
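    A cut-down sketch of the size-jump part of such a nightly scan, comparing two snapshots of file sizes. The paths and the 25% threshold are assumptions for illustration:

    ```python
    def size_jumps(previous, current, threshold=0.25):
        """Compare two {path: size_in_bytes} snapshots and flag files
        whose size grew by more than `threshold` (as a fraction)."""
        flagged = []
        for path, size in current.items():
            old = previous.get(path)
            if old and old > 0 and (size - old) / old > threshold:
                flagged.append(path)
        return flagged

    yesterday = {"P:/Proj1/central.rvt": 200_000_000,
                 "P:/Proj2/central.rvt": 80_000_000}
    today = {"P:/Proj1/central.rvt": 300_000_000,
             "P:/Proj2/central.rvt": 81_000_000}
    print(size_jumps(yesterday, today))  # → ['P:/Proj1/central.rvt']
    ```

    The same comparison over week-old snapshots gives the weekly jump check; new-project and corruption detection would need extra logic beyond this sketch.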

    From the nightly .slog data (synchronous log) in the backup folder on workshared files:
    -- Operation Crash - If any "Crash" operations happen.
    -- Operation Saving Error - If any "STC:Save:Error" operations happen.
    -- Multi Versions - If users access a central model with different Revit builds.
    -- Multi Offices - If users access a central model from different locations.
    -- Multi Sessions - If more than 5 users access a central model concurrently.
    -- Long Sessions - If a user accesses a central model for longer than 48 hours.
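    A rough sketch of the crash and save-error part of that .slog scan. Matching on raw substrings is a simplification of the real log format, and the sample lines are invented; real session analysis would also need to pair session open/close timestamps:

    ```python
    def slog_alerts(lines):
        """Scan .slog lines for crash and save-error operations.
        Substring matching here is a simplification."""
        alerts = []
        for lineno, line in enumerate(lines, 1):
            if "Crash" in line:
                alerts.append((lineno, "Operation Crash"))
            elif "STC:Save:Error" in line:
                alerts.append((lineno, "Operation Saving Error"))
        return alerts

    sample = [
        "$1a2b 2017-05-01 10:03:12 >STC:Save",
        "$1a2b 2017-05-01 10:05:40 >STC:Save:Error",
        "$3c4d 2017-05-01 11:17:02 Crash",
    ]
    print(slog_alerts(sample))
    # → [(2, 'Operation Saving Error'), (3, 'Operation Crash')]
    ```

    The multi-version, multi-office, and long-session checks would follow the same pattern, grouping log lines by session identifier before testing each group.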

    Then, on every save of the file, we have coded a routine to get:
    Central : \\Path...
    Date :
    Type : Project Report
    Host Id : (System Name)
    User :
    Client Name :
    Building Name :
    Project Status :
    Project Issue Date :
    Linked RVT Files :
    Linked DWG Files :
    Inserted DWG Files :
    Inserted Image Files :
    Total # of Sheets :
    Unplaced Views :
    Group Instances in Model :
    Group Not Placed :
    Model Groups :
    Detail Groups :
    Design Options :
    Phasing :
    Line Styles :
    Line Patterns :
    Text Styles :
    Dimension Styles :

    We are harvesting all of this data into a SQL DB to track it over time, derive some key performance indicators, and track the number of people accessing each file and how that degrades its performance. Combined with internal info, this data also lets us see multi-office work on projects and set up WAN acceleration to further improve file performance.

    All good stuff

    There are also a few tools out there doing some of this, like IMAGINiT Clarity and Kinship, tracking file and performance metrics. But Dynamo is free, and works on only one file at a time.


  3. This comment has been removed by the author.

    1. Looks promising. Any way to download this?



    2. Looks great! Is there a way to download it?



