BIM is synonymous with construction. Graeme Forbes, Managing Director of Clearbox, explains why a data-centric approach is superior to a file-based process
Love it or hate it, BIM is here to stay! Although no one said it would be easy, I am sure many of us in this space would not have thought it would be this hard.
There are two differing approaches to BIM: managing projects at a file level and managing them at a data level. Why does that matter? While we have always preferred the latter, it is interesting to note that some key vendors in the market are now moving from their traditional file-based approaches to that way of thinking as well.
There are undoubtedly advantages and disadvantages of each, lessons learnt and examples from other industries, plus the harsh reality of experience in AEC, which is now driving a fundamental shift for the future.
BIM has developed from a design-centric view on life: authoring tools have been extremely flexible, allowing users to work the way they want, to express themselves and to stay in control of how the information is defined. Historically, we have always worked in silos, passing the information needed by the next person in the chain as files.
As such, the technology that has been developed has matched that manual way of working, an electronic mirror of the old way of silos.
The PAS 1192 process arrived to encourage us to collaborate. Some authoring tools helped us to define and classify the data, and as data has become a more prominent feature, enabling the scalability and predictability of processes through analysis, the need to manage it has helped to make some authoring tools more popular.
For a short while, model files were expected to contain everything, and federation of the models into one BIM became the norm in our effort to see everything together. While we could often see the model made up of the objects, the data hanging off the back of those objects was, and generally remains, inconsistent. This has led to files becoming bloated, and because managing the data requires knowing how to use the authoring tools, bottlenecks arise in the process.
The flexibility of the tools, while essential to the software provider, then becomes a nuisance to the user, or more particularly to the business or project. It is not just a case of configuring the tools and, ideally, being able to lock that configuration down; when you can't, the issue becomes finding which information is not aligned.
Compounding the complexity is the fact that once you set a model architecture, it follows all the way through into construction. Several models often need to be accessed to see and address a single issue, which requires specialist skills, and when you are working on the information, it is imperative that others are not editing the same information, even if they have the contractual right to make the edit.
So we often see models duplicated so that each party in the downstream process can make their own updates. However awkward the update process is in practice, all works relatively well until the original model changes. One contractor who spent time with us needed 33 slides to describe the process they had adopted. Battle-hardened as we now are, even we were surprised by the multitude of tools and the number of manual steps involved, each one, of course, needing some manual file manipulation to enable the exchange. And if a new design file arrived, large parts of the process had to be repeated. No wonder so many people do not start until the preceding party has finished. This is one of the reasons collaboration, and real improvement in efficiency, is hard to find.
Why do we think data-centric is better?
In a data-centric process, the federation is (obviously) around the data, so to perform any task, questions are asked of that data. Packets of data can be checked out for editing and, in the right tool for the right purpose, issues such as tracking ownership, access and security can easily be overcome. Structuring the information around the data in this way also allows it to be made consistent, and therefore more usable to the process.
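The check-out idea above can be sketched in a few lines. This is a minimal illustration, not any vendor's API: the class, method and object names are invented. The point is that when data is held at object level, an exclusive check-out per editor prevents two parties from changing the same packet of information at once, while ownership is tracked as ordinary data.

```python
# Illustrative sketch of object-level check-out in a data-centric store.
# All names here (DataStore, object ids, attributes) are assumptions.

class DataStore:
    def __init__(self):
        self.objects = {}      # object_id -> attribute dict
        self.checked_out = {}  # object_id -> name of the editor holding it

    def add(self, object_id, attributes):
        self.objects[object_id] = dict(attributes)

    def check_out(self, object_id, editor):
        # Only one party may hold a packet of data at a time.
        owner = self.checked_out.get(object_id)
        if owner and owner != editor:
            raise PermissionError(f"{object_id} is locked by {owner}")
        self.checked_out[object_id] = editor

    def update(self, object_id, editor, **changes):
        # Edits are refused unless the editor has checked the object out.
        if self.checked_out.get(object_id) != editor:
            raise PermissionError(f"{editor} has not checked out {object_id}")
        self.objects[object_id].update(changes)

    def check_in(self, object_id, editor):
        # Releasing the object makes it available to the next party.
        if self.checked_out.get(object_id) == editor:
            del self.checked_out[object_id]
```

Because ownership lives in the data rather than in a copied file, the question "who is editing this?" is answered by a lookup, not by chasing duplicated models.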
Our problem was that this shift in approach, and the benefits and power of integration, left many of the tools the industry had grown used to struggling to cope, not least because of the sheer scale of the information we now handle. However, now that we can cut an IFC across the data, subsets of information can easily be processed in other tools if the user prefers a fragmented toolset.
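"Cutting across the data" amounts to selecting objects by attribute rather than shipping a whole model file. A hedged sketch, with invented object records and attribute names (these are not real IFC structures), might look like:

```python
# Illustrative only: a model held as object records, and a subset "cut"
# by attribute criteria instead of by exchanging a complete file.

objects = [
    {"id": "wall-01", "type": "IfcWall", "level": "L1"},
    {"id": "door-01", "type": "IfcDoor", "level": "L1"},
    {"id": "wall-02", "type": "IfcWall", "level": "L2"},
]

def subset(objects, **criteria):
    """Return only the objects matching every attribute criterion."""
    return [o for o in objects
            if all(o.get(k) == v for k, v in criteria.items())]

# e.g. subset(objects, type="IfcWall") hands just the walls to another tool
```

The downstream tool receives only the packet it needs, which is what makes a fragmented toolset workable at scale.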
Of course, in a central, data-centric process other opportunities arise. When I carry out processes such as RFIs or clash detection, I can store not only the output, a form or a report, in the database, but also the particulars: the view position, the objects the RFI refers to, and so on. That makes it easy to return to the subject and check that what I asked for has happened, and it also makes it easy to run analytics and reports. Best of all, it is real time and can be derived without asking someone to compile a report: standard reports driven automatically from the data. The data is all visible, too; it is no longer buried in a file but is readable and interrogatable from a database served over the web. Users can write their own API calls to get hold of the information they particularly need, and since the information is held and coded at an object level, aggregating data in any direction is a simple, industry-standard task.
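To make the RFI idea concrete, here is a minimal sketch using an in-memory SQLite database. The table and column names are assumptions for illustration, not a real product schema; the point is that once the view position and referenced objects are stored as structured records rather than buried in a file, a "standard report" is just a query over the data.

```python
# Illustrative sketch: RFIs as queryable records (schema is invented).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE rfi (
        id INTEGER PRIMARY KEY,
        title TEXT,
        status TEXT,
        view_x REAL, view_y REAL, view_z REAL  -- camera position of the issue
    )""")
conn.execute("""
    CREATE TABLE rfi_object (               -- objects each RFI refers to
        rfi_id INTEGER REFERENCES rfi(id),
        object_id TEXT
    )""")

def raise_rfi(title, view, object_ids):
    """Store the RFI plus its particulars (view position, objects)."""
    cur = conn.execute(
        "INSERT INTO rfi (title, status, view_x, view_y, view_z) "
        "VALUES (?, 'open', ?, ?, ?)", (title, *view))
    conn.executemany("INSERT INTO rfi_object VALUES (?, ?)",
                     [(cur.lastrowid, o) for o in object_ids])
    return cur.lastrowid

def open_rfis_per_object():
    """A standard report derived directly from the data, in real time."""
    return dict(conn.execute("""
        SELECT o.object_id, COUNT(*)
        FROM rfi_object o JOIN rfi r ON r.id = o.rfi_id
        WHERE r.status = 'open'
        GROUP BY o.object_id"""))
```

Nobody compiles this report by hand: it is recomputed from the live records whenever it is asked for, and the same records could equally be aggregated by status, by author or by zone.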
While we never believed it would be easy, we do know that in the long run it is easier to work from a data-centric approach. Indeed, we are convinced that the savings the industry needs, and the ability to attract the new blood that will sustain the future of this core industry, will rely on modern tools and approaches, many of which we have become used to in the era of the smartphone.
While several businesses are now engaged in data-centric tools, it is perhaps the move of the largest providers of authoring tools towards a data-centric approach that assures us the future transformation to a truly connected and integrated environment will happen. While some will inevitably wait for the benefits before, no doubt, re-purchasing complex suites of interconnected tools, the Clearbox tools address the same end game in keeping with the current and foreseeable procurement arrangements that define the AEC industry.
Please note: this is a commercial profile