Thursday, May 19, 2016

UniPhi 12 - Cash Flow Adjustments of Time

The ability to instantly create reliable cash flows from your cost budget, phased according to the anticipated project duration, is a great feature of UniPhi 12. At the press of a button, cost managers can see, with a high degree of certainty, the expected rate at which their project will draw down on its funds or earn its value. As described in this blog post, we have even built an algorithm that phases your costs in an s-curve according to the earned value profile of your closed projects.


But what happens when there are date changes that impact the commencement, or which occur during the construction phase of your project? Well, it just so happens that we have solved that little issue too with the release of UniPhi 12.

Introducing the project cost auto phasing Adjust Start option.


As its name suggests, this feature allows you to push out, or draw back, the phased cash flow for your projects. The key point is that the cash flow profile stays the same; only its start and end dates change. Although this is not as revolutionary as some of the other recent features targeted at the construction and cost management industry, it IS an improvement that just makes life easier.
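
For readers who like to see the mechanics, here is a minimal Python sketch of the idea (our illustration only, not UniPhi's actual implementation): the phased amounts are untouched, only their period labels move.

```python
from datetime import date

def adjust_start(phasing: dict[date, float], months: int) -> dict[date, float]:
    """Shift a monthly cash flow profile forward (positive months) or
    back (negative months) without changing the phased amounts.

    `phasing` maps the first day of each month to a phased amount.
    """
    def shift(d: date) -> date:
        total = d.year * 12 + (d.month - 1) + months
        return date(total // 12, total % 12 + 1, 1)

    return {shift(d): amount for d, amount in phasing.items()}

# Example: push a three-month profile out by two months.
profile = {date(2016, 7, 1): 100_000.0,
           date(2016, 8, 1): 250_000.0,
           date(2016, 9, 1): 150_000.0}
print(adjust_start(profile, 2))
# The same amounts now fall in September, October and November.
```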

It's worth mentioning that this enhancement was made possible by the feedback we have received from you, our clients. Our intention is always to build on the success and innovative features of UniPhi so that it keeps increasing efficiency and making life easier for our user community. We develop our software in an iterative manner (sprints would be the latest term for it, but having used this method since 2003, we'll stick with iterations - kind of like how we still say hosted instead of cloud). This method allows us to incrementally improve our product, and occasionally a small change like the one above leads to something big... Watch this space!

On that note, if you have a process that is causing you or your company inconvenience, or that you think is downright annoying, why not raise a new topic in our Forum?

Thursday, April 21, 2016

UniPhi 12 - New release set to disrupt the Cost Management industry - July 1 2016

UniPhi's exciting major product release, UniPhi 12, is set to change the way cost management functions in the construction industry. Significant new features and enhancements to the cost module will not only greatly reduce the time cost managers spend on mundane tasks but will unlock the intellectual property stored in their estimates and post contract services. UniPhi 12 shifts the focus of cost managers from simply producing cost plans to enabling "big data" analysis at the click of a button. The new cost management features outlined in our recent 4 part blog series on benchmarking reduce the time taken to compile reports and provide access to previously buried data, allowing the cost manager to offer more nuanced consulting advice. It's an innovative change with the potential to cause massive disruption across the construction industry. Are you ready?



The majority of new features included in UniPhi 12 are targeted squarely at those in the construction cost management industry. Over the coming weeks we will be writing individual blog posts that explain and demonstrate the key features set to transform the industry.

In addition to these key disruptive features, UniPhi 12 includes enhancements aimed at improving the efficiency of a cost manager's bread-and-butter services, including:
  • Flexible cost coding
  • Expanded report writing capability for cost plan reports and post contract reporting
  • Enhanced provisional sums and EOT tracking
  • Enhanced cash flow management capability
  • Derivatives for budget items and for bank guarantees


In the lead up to the official release, we are seeking feedback from clients who would like to get in early and upgrade ahead of the scheduled July 1 launch. Your feedback will assist us in further refining and enhancing the features. The benefit for you is direct input into the final release of UniPhi 12, and as an early adopter you will reap the efficiency benefits sooner.

The process for upgrading your deployment is straightforward - just get in touch with us at info@uniphi.com.au and specify a date and time that works for your organisation, and we will take care of the rest. The upgrade requires an outage period, which can occur at any time that suits you, including outside of work hours.

Which cost management issues cause you the greatest angst, or consume the most time? Let us know by commenting on this blog, or start a discussion in our forum. We're interested to hear about your hurdles, and will gladly explain how our new features can simplify things for you and your cost managers.

Thursday, April 07, 2016

Benchmark Series IV - Using Benchmarks in Estimating

The third post in the Benchmark series looked at the outputs that can be achieved once robust, quality assured data is captured in a system designed to consolidate and present the learnings from that information. It is possible, however, to do more than just re-purpose benchmark information for sense checking and comparative explanation. It is possible to generate new information. Two game-changing possibilities are the generation of new project cost estimates and the phasing of those costs over time.

The goal of building parametric models that use benchmark averages and key design parameters to generate estimates is to provide decision makers with quick-turnaround information that drives investment decisions. Property developers looking at new greenfield sites, or at the redevelopment or refurbishment of brownfield sites, can get a quick cost comparison between these ideas or opportunities. And by quick we're talking minutes.

Our involvement in the Global Unite project at AECOM has demonstrated that using large data sets to drive robust averages can lead to remarkably accurate early stage estimates (plus or minus 10%). To clarify, large data sets are those greater than 100 projects with verified cost data. These models work by assigning a key driver to each element within the estimate (typically a floor rate, though for some elements it could be an elemental rate based on benchmark ratios for quantities). By entering the likely floor area of the development and then tweaking certain elements for the design parameters of the site (e.g. substructure complexity, hydraulics and electrical requirements, etc.), a knowledgeable cost manager can provide a project cost estimate in minutes instead of weeks.
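
As a rough illustration of how such a parametric model works, here is a small Python sketch. The element names, rates and adjustment factors are hypothetical, invented for the example rather than drawn from any real benchmark set.

```python
# Illustrative parametric estimate: benchmark $/m2 floor rates per element,
# scaled by gross floor area and tweaked for site-specific parameters.
BENCHMARK_RATES = {          # hypothetical average $/m2 GFA per element
    "substructure": 180.0,
    "superstructure": 620.0,
    "finishes": 310.0,
    "hydraulics": 95.0,
    "electrical": 140.0,
}

def parametric_estimate(gfa_m2: float,
                        adjustments: dict[str, float] | None = None) -> float:
    """Return a quick elemental estimate: rate x GFA per element, with
    optional multipliers for known site conditions
    (e.g. {"substructure": 1.3} for a difficult site)."""
    adjustments = adjustments or {}
    return sum(rate * gfa_m2 * adjustments.get(element, 1.0)
               for element, rate in BENCHMARK_RATES.items())

# A 30,000 m2 office tower on rock (substructure 30% above benchmark):
print(f"${parametric_estimate(30_000, {'substructure': 1.3}):,.0f}")
```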

Critical success factors for this parametric model are:

1) A large enough data set to generate a true benchmark average. Many estimators worry about the outliers in a data set, but it is the outliers that make an average a true average, cancelling each other out and providing the standard deviation that assists in tweaking the estimate to the specific vagaries known at the time of estimating.

2) A data set that includes elemental quantities and hence benchmark ratios for things like wall to floor or windows to floor.

3) Metadata classification that allows the end user to reduce the global data set to a sample of similar projects that fit the characteristics of the early stage investment idea. These don't have to be too specific, but should at least include the sector, project type, asset type, work type, location and estimate date.

4) Solid indexing data to translate cost information from one location at a certain point in time to a different location at a future point in time (see the sketch below).
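
Point 4 might look like this in practice. The sketch below uses made-up location and escalation indices purely to show the mechanics of rebasing a cost; any real implementation would rely on properly maintained index tables.

```python
# Hypothetical index tables: translate a cost from one location/date to
# another by dividing out the source indices and applying the target's.
LOCATION_INDEX = {"Sydney": 100.0, "London": 135.0, "Singapore": 90.0}
TIME_INDEX = {2014: 100.0, 2015: 103.5, 2016: 106.1}  # escalation by year

def rebase_cost(cost: float, from_loc: str, from_year: int,
                to_loc: str, to_year: int) -> float:
    """Rebase a historical cost to a different location and point in time."""
    location_factor = LOCATION_INDEX[to_loc] / LOCATION_INDEX[from_loc]
    time_factor = TIME_INDEX[to_year] / TIME_INDEX[from_year]
    return cost * location_factor * time_factor

# A $50m Sydney cost from 2014, rebased to London in 2016:
print(f"${rebase_cost(50_000_000, 'Sydney', 2014, 'London', 2016):,.0f}")
```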

These four things are where cost consultancies can differentiate themselves, as their past experience and knowledge drive their ability to provide parametric estimates.

The same requirements exist for the other key piece of new knowledge gained from benchmarking: cash flow phasing. Utilising the cost phasing profiles of projects that have actually happened is key to getting an accurate estimate of how funds will need to be drawn down on a new investment.

A significant advantage of the cash flow phasing capability is that the value of past projects is irrelevant (hence no need to index costs); what we're interested in is the way funds were spent (i.e. what percentage of total project cost was spent in each month from start to finish). This profile will again be accurate for like investments, and again a pool of 100 or more projects evens out the poor performers with the good to give an accurate benchmarked phasing profile. The simplicity and elegance of this solution is such that it is surprising that no-one (to our knowledge) is using it in the market to date.
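
A rough sketch of that normalisation, assuming each completed project is represented by its list of monthly certified payments. Padding shorter projects with zeros is a simplification for the example; a real implementation would rescale durations to percent-complete instead.

```python
# Normalise each project's monthly spend to a fraction of its total cost,
# then average across the pool. Project values drop out entirely, so no
# cost indexing is required.

def phasing_profile(projects: list[list[float]]) -> list[float]:
    """Average normalised monthly spend across completed projects."""
    months = max(len(p) for p in projects)
    normalised = [[amt / sum(p) for amt in p] + [0.0] * (months - len(p))
                  for p in projects]
    return [sum(month) / len(projects) for month in zip(*normalised)]

payments = [
    [50, 120, 300, 320, 150, 60],   # project A's monthly certified payments
    [40, 180, 260, 280, 140],       # project B
]
profile = phasing_profile(payments)
print([round(p, 3) for p in profile])  # fraction of total cost per month
```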



More on cash flow phasing using benchmarked data can be found in our recent blog post and in the demonstration of this feature in UniPhi's product suite at our recent webinar on the topic.

Friday, March 11, 2016

Benchmark Series III - Utilising Data Captured

So far in this series I've spent most of the time describing the process of, and critical success factors for, capturing good benchmarking data. The end of the last post, however, began the discussion of how to use this information. Using information stored in databases is often harder than you'd think. Some systems are great at designing efficient ways to capture data but then spend very little effort on ways to view what has been captured. When the goal is benchmarking, the outputs are the most important functional units.



Outputs vary dramatically depending on the user's perspective on benchmarking and what they want to use it for. Below is a list of examples we have encountered over the past four years (how many could you lay your hands on in less than 5 minutes?):

  1. Show me the average elemental rates for particular sectors, project types, asset types and work types for a particular location or across locations
  2. Using the same function, show me the average trade rates over the past 12 months in a specific location to update a rate library in a cost planning tool
  3. Display average USD/SF GFA rates as a bar chart for particular sectors, project types, asset types and work types for a particular location or across locations
  4. Map the range of wall to floor ratios for selected filters
  5. List architects from most expensive to cheapest for average new builds in commercial office towers in London using a £/SF rate of GFA for construction costs.
  6. List the top 10 most expensive museums on a USD/SF of GFA basis across the US in the past 10 years. Show me their design characteristics and functional units (e.g. excavation required, area per floor, no. of levels, no. of rooms etc)
  7. Show on a map where my projects have been and change the size of the circle to reflect the cost of the project
  8. What's the average time to reach final account on a commercial new build project costing more than $20m and lasting longer than 6 months?
  9. Using past projects, price a new project for me in this location for a new build industrial factory with 10,000 sqm of floor space.
  10. Utilise past projects to phase my new one

And the list goes on...

One of the difficulties in developing systems and processes to cater for all these types of questions is the design of the report engine. Some of the questions are impossible to answer via a generic report interface (e.g. the parametric pricing model used for number 9 above). But most can be answered via a simple-to-use data analytics tool and a well designed business intelligence cube. We've used several interface options, including bespoke web applications, to present data captured in our BI cube.


A possible alternative to bespoke web applications, which can be expensive to build, maintain and extend, is SharePoint's integration with Analysis Services (which is what our BI cube is built on).



The biggest lesson learnt in this space is to gather as long a list of desired outputs as possible, as early as possible, to assist in preparing an overarching architecture that can support what will ultimately be a variety of interfaces into the data.


If you're interested in how UniPhi's software can help you benchmark and get the outputs you want, contact us.

Monday, March 07, 2016

UniPhi Feature - S-Curve Phasing Algorithm

Among the problems cost managers face when attempting to produce a reliable cash flow is the disparate nature of the information they rely on (estimation tools, Excel spreadsheets, emails, document versions, etc.). Once a cash flow is created, it requires further analysis to ensure data accuracy and validity, all of which takes yet more time. Until recently, UniPhi allowed an estimate to be phased in a linear manner over a period of time, or manually to fit the Gantt. While working closely with a number of our clients, we realised that no current software system allows for automated 'S' curve phasing. An 'S' curve is the shape a cash flow typically takes when profiled on an XY chart, because projects generally start slow, get busy in the middle, and tail off towards practical completion.




At UniPhi we are always looking for ways to improve our product and differentiate ourselves from other software applications in the marketplace. With the goal of improving cash flow phasing in mind, we took a step back and analysed the data already stored in UniPhi. The core design of UniPhi means that data entered once is available in many places. One example is the progressive sequence of payments made against a contract. This information tells us the actual dates payments were made and their individual values, which means our clients build up 'S' curve profiles on actual projects as time goes by.

Our development team realised they could use this "Actual" phasing to create an automated benchmark algorithm. The most impressive thing about the new benchmark feature and its underlying algorithm is that your UniPhi users have been automatically generating the benchmark data via the contract admin function they've used for years.

We completed a preliminary implementation of the "S" curve functionality for one of our clients and found that, compared with their old process of manual calculations, UniPhi's cash flow phasing and associated "S" curve were produced with great accuracy, to their great satisfaction. Not to mention much faster than the old method. Through this process of collaboration and review, we discovered additional nuances that could also be leveraged, so we added a productivity column. Its purpose is to factor in periods of low productivity, e.g. Christmas, New Year, and public holidays. The concept of factoring in these periods of lower productivity is powerful, yet our design keeps it very simple to configure and update.
Adding your organisation's productivity calculation is simple
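
To make the concept concrete, here is one plausible way to apply productivity factors to a phased profile. This is an illustrative sketch, not UniPhi's actual calculation: low-productivity months get less spend, which is pushed into the remaining months so the total budget is unchanged.

```python
def apply_productivity(phased: list[float], factors: list[float]) -> list[float]:
    """Weight each period by its productivity factor, then rescale so the
    profile still sums to the full budget."""
    weighted = [amount * factor for amount, factor in zip(phased, factors)]
    scale = sum(phased) / sum(weighted)
    return [w * scale for w in weighted]

phased = [100.0, 250.0, 300.0, 250.0, 100.0]   # Oct..Feb phased amounts
factors = [1.0, 1.0, 0.5, 0.5, 1.0]            # holiday slowdown in Dec-Jan
print([round(v, 1) for v in apply_productivity(phased, factors)])
# December and January shrink; the other months absorb the difference.
```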


I recently demonstrated the ability to phase costs using benchmarked actual data to a client, and his initial response was that because each project type is subtly different, the benchmark data would also be subtly different (e.g. building a high school is different to building a hospital). The good news is that UniPhi also allows you to select the distinct set of data you need, according to your own criteria. UniPhi has always empowered our clients to perform administration and configuration tasks through flexible design interfaces, and this capability extends to creating your own project custom fields. If, for example, you have previously used UniPhi to manage several construction projects of varying size, you can create a project field specifying the "number of floors" built for each project. Then, when you are tasked with managing a smaller single-floor project, you can filter your benchmark data so that only smaller 1 to 2 floor projects are displayed. No other system has this level of functionality.
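
Conceptually, the filtering is straightforward. Here is an illustrative sketch with hypothetical field names ("sector", "floors"); UniPhi's custom fields play the same role.

```python
# Filter the benchmark pool by custom project fields before averaging.
projects = [
    {"name": "Tower A", "sector": "Commercial", "floors": 30},
    {"name": "Depot B", "sector": "Industrial", "floors": 1},
    {"name": "Office C", "sector": "Commercial", "floors": 2},
]

small_commercial = [p for p in projects
                    if p["sector"] == "Commercial" and p["floors"] <= 2]
print([p["name"] for p in small_commercial])  # ['Office C']
```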


With this new benchmark cost phasing, your own data becomes your most valuable asset. By simply referencing the progress payments which have been made in your previously completed projects, you can get a reliable forecast of costs for your current project.

As UniPhi also features integrated contract and document management modules, our clients have an end-to-end solution for managing projects. Estimates can be produced in UniPhi or imported from a third party product, costs can be phased according to your organisation's own benchmark data, and once awarded, the costs can be managed with accuracy and transparency. Because your project has relied on benchmark data, you will have confidence that the phasing is correct. In fact, you can take it to the bank!

Thursday, March 03, 2016

Benchmark Series II - Control Quantities

The solution

See how UniPhi can capture benchmark ratios from elemental quantities in this 1min 30sec video


Calculation and use

Control quantities are ratios between one element of a building or construction project and another. They can be used to measure a whole variety of things, including design efficiency (for example the wall to floor ratio), cost drivers (preliminaries to construction costs), density (floor area ratios) and thermal performance (window to floor ratio).

Graphical display of relationship between floor area and wall to floor ratio
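
To make the arithmetic concrete, here is a tiny sketch computing a few common control quantities from a hypothetical elemental take-off (the quantities are invented for the example).

```python
elements = {
    "external_wall_area_m2": 9_600,
    "gross_floor_area_m2": 30_000,
    "window_area_m2": 4_200,
    "site_area_m2": 2_500,
}

# Control quantities are just ratios between elemental quantities.
wall_to_floor = elements["external_wall_area_m2"] / elements["gross_floor_area_m2"]
window_to_floor = elements["window_area_m2"] / elements["gross_floor_area_m2"]
floor_area_ratio = elements["gross_floor_area_m2"] / elements["site_area_m2"]  # density

print(f"wall:floor {wall_to_floor:.2f}, window:floor {window_to_floor:.2f}, "
      f"FAR {floor_area_ratio:.1f}")
```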

The good thing about ratios and control quantities is that they can be compared globally without worrying about inflation and currency differences. The ability to compare design parameters for similar types of projects across countries can greatly assist multi-national investors, and perhaps even drive change in various countries towards better practice. Without a global benchmark of control quantity information, discovering these discrepancies or variations is impossible.

But the use of control quantities doesn't have to be limited to global benchmarking. Typically these ratios are known by cost managers, and calculating them for each estimate provides a sense check on the validity of the elements in the estimate. The issue is the time this important task takes, which stems from the problems outlined below.

Problems and solutions to obtaining and using control quantities

In our experience, there are four problems an organisation has to resolve to be able to obtain and use control quantities in their cost planning:

  1. Estimating to a standard elemental level (the NPWC coding structure, for example)
  2. Aggregating quantities to the elemental level
  3. Consistently applying the range of formulas to the model
  4. Aggregating multiple similar projects to compare the current estimate to the benchmark

Standard elemental structures

Standards are the drivers for many things. Much tech hardware and software would cease to exist without the industry standards used and relied upon in their build. The same goes for cost benchmarking. Without standards it becomes impossible to benchmark, and without benchmarks, control quantities have no comparator to give them a use. So the key to generating a successful benchmarking database is to get estimators to compile budgets against a standard elemental structure, and for those estimates to carry elemental quantities as well as monetary totals.

Part of the NPWC industry standard elemental structure for buildings
Inhibitors to creating a standard include: clients of estimating services wanting the estimate to fit their own bespoke template; recalcitrant estimators who see value only in their single estimate, not in the flow-on benefits of capturing it and comparing it against others; and organisations struggling to agree on what the standard should be.

The solutions are, respectively: selling the benefits of benchmarking and control quantity analysis to these clients (see the 5D BIM discussion below); transparently displaying who is generating good benchmarking information and rewarding staff on this type of performance; and adopting an industry standard as your estimating standard (e.g. NPWC, POMI, NRM etc).

Elemental quantities

Problem number 2 is easily solved by the many estimating software products in the marketplace that allow for quantity inheritance. Two that come to mind are CostX by Exactal and Cato by Causeway. Both systems allow you to pick which detailed line item to inherit when rolling up costs from the detailed to the elemental level. For example, the wall area quantity may appear many times at the detailed level beneath the element "external walls (excl windows)", with different rates applied. By marking one line, or a combination of lines totalling the wall area, as the quantity to inherit when summing the total external wall cost, the aggregated quantity problem disappears (see the sketch below).

Elements with Quantities
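
Here is a minimal sketch of the inheritance idea (our own illustration, not CostX's or Cato's implementation): every detailed line contributes cost, but only flagged lines contribute quantity, so the wall area isn't double-counted across differently-rated lines.

```python
detail_lines = [
    {"desc": "precast panels", "qty_m2": 6_000, "rate": 310.0, "inherit": True},
    {"desc": "curtain wall",   "qty_m2": 3_600, "rate": 880.0, "inherit": True},
    {"desc": "panel sealant",  "qty_m2": 9_600, "rate": 12.0,  "inherit": False},
]

# All lines roll up to the element cost; only inherited lines sum to the
# element quantity (precast + curtain wall = the true 9,600 m2 wall area).
element_cost = sum(l["qty_m2"] * l["rate"] for l in detail_lines)
element_qty = sum(l["qty_m2"] for l in detail_lines if l["inherit"])

print(f"external walls: {element_qty:,} m2 at "
      f"${element_cost / element_qty:,.2f}/m2 average")
```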

Time taken to calculate the ratios

Once our estimates are compiled in a standard format and quantities are aggregated correctly to the elemental level, the next step is to make sure each ratio is calculated correctly and presented to reviewers of cost estimates in an integrated fashion. Many organisations use pen, paper and a calculator to complete the sense-check calculations for an estimate. This is obviously prone to error and gives the reviewer no transparency that the calculation has been completed correctly.

Key metrics dynamically captured and classified via cost plan import
UniPhi solves this through its calculated metrics functionality. Here, a system administrator uses the codes in the elemental structure to define formulas that are calculated either on import from Cato, CostX or Excel, or when data is manually keyed into the UniPhi cost module. The documents module is then used to present the estimate, control quantities, benchmark ratios and design documentation to a reviewer in one place.
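
One plausible way to picture the calculated-metrics idea is a registry of formulas keyed by elemental codes, evaluated against whatever quantities arrive on import. The codes and formulas below are hypothetical, not UniPhi's actual configuration.

```python
METRIC_FORMULAS = {
    "wall_to_floor": lambda q: q["EW"] / q["GFA"],    # external walls / floor
    "window_to_floor": lambda q: q["WD"] / q["GFA"],  # windows / floor
    "prelims_pct": lambda q: q["PR"] / q["CC"],       # prelims / construction
}

def calculate_metrics(quantities: dict[str, float]) -> dict[str, float]:
    """Evaluate every formula whose input codes are present in the import."""
    results = {}
    for name, formula in METRIC_FORMULAS.items():
        try:
            results[name] = formula(quantities)
        except KeyError:
            pass  # this estimate didn't include every code; skip the metric
    return results

imported = {"EW": 9_600, "GFA": 30_000, "WD": 4_200, "CC": 52_000_000}
print(calculate_metrics(imported))  # prelims_pct skipped: no "PR" code
```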

See how we do this end to end in our YouTube videos on capturing calculated metrics.

Setting up the key metrics


Using these metrics to calculate benchmark ratios


Finding similar estimates to aggregate and compare

The final and most important issue, which effectively makes all the other solutions moot, is the disparate storage of these estimates. Many organisations email colleagues asking for Excel spreadsheets of similar estimates, wait hours or days for the data to arrive, and then have to resolve the three issues above before they can manually aggregate the estimates and use the averages.

The portfolio nature of UniPhi means it can average all projects across the portfolio that match the characteristics of the current project. This provides the benchmark number to compare the current estimate against, and that comparison gives the cost manager a wealth of information to bring to the table in any value management or cost saving exercise that might ensue.

Graphical display of relationship between floor area and wall to floor ratio for similar projects

Cost estimators' response to the threat of 5D BIM

It is this last piece of the puzzle that opens a door of opportunity to cost estimators, be they internal estimators for contractors or external consultants for clients of construction services. The time taken to manually measure drawings to derive estimates has been reduced by BIM software that lets a user measure with clicks on a screen and adjusts the take-off when an architect revises a drawing. However, by capturing data in a structured manner and using the competitive advantage of pools of information on previous projects to provide insights into the current design, an estimator can actively participate in the design conversation.

This value-add removes the commodity nature of the industry's work and lifts cost consultancy into the realm of management consultancy, with cost consultants acting as the integrators of design and engineering parameters to ensure a cost effective solution that meets the client's functional and service outcomes.


Please feel free to give us your feedback on the complexities and issues faced in providing and using benchmark data at your organisation.

Monday, February 29, 2016

UniPhi Feature - Customer Dashboard

UniPhi's web application features a graphical Dashboards tab which presents a live and fully transparent view of your organisation's project data. Among the available dashboards are Summary, Time, Submissions and Issues. Each UniPhi dashboard presents a neat graphical summary of the information stored in UniPhi, designed specifically to appeal to those of us who relate more to a graphical presentation of information than to tables of data.

Based on requests and feedback from our clients we have expanded the selection of dashboards to include a new Customer Dashboard tab.

The latest view of your customer data

The Customer Dashboard presents your "Top 5" customers in terms of revenue and profit across your organisation. It contains 4 graphs displaying your top 5 customers for the current quarter and over the past year, with a comparison to the same time last year.
The graphs each show:
- Top 5 Revenue in the Quarter
- Top 5 Profit in the Quarter
- Revenue by Customer
- Profit by Customer
See your most valuable customers, at a glance
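
The aggregation behind a "Top 5" view is simple to picture. This sketch, with hypothetical contract records, is only an illustration of the idea, not UniPhi's internal query.

```python
from collections import defaultdict

contracts = [
    {"customer": "Acme Developments", "revenue": 420_000},
    {"customer": "Bellview Group",    "revenue": 180_000},
    {"customer": "Acme Developments", "revenue": 250_000},
    {"customer": "Cornerstone Build", "revenue": 310_000},
]

# Sum revenue per customer, then take the five largest totals.
totals: dict[str, float] = defaultdict(float)
for c in contracts:
    totals[c["customer"]] += c["revenue"]

top5 = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:5]
print(top5)  # Acme Developments leads with 670,000
```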

Filters can be applied to view this valuable data through different lenses. For example, you may be interested in seeing your top 5 customers across your entire portfolio, or by sector, project type, or location. Each of the categories appearing in these filters can be defined by your UniPhi administrator, so you will always be able to focus on the information most relevant to you and your organisation.


UniPhi dashboards adhere to our core design principle whereby data entered once is made available in numerous places, in real time. This means that each time a revenue contract is created anywhere in your UniPhi deployment, the value of that contract immediately updates the customer dashboard, so you always see your top 5 as of right now. Having this information available dynamically means you no longer need to wait for a report to be compiled, consolidated and distributed to understand how your business is tracking today, not last year, last month, or last week.

These are the four graphs we've developed so far, but like all database driven systems, there are endless ways to aggregate customer data. Leave a comment on the blog if you think there are better metrics for customer information. Your thoughts will drive our next round of development.

Thursday, February 25, 2016

Benchmarking Series Part I - Data Capture and Quality Control

The Benchmarking Challenge

One of the biggest challenges for organisations that want to benchmark is finding the relevant information in a form that is comparable across the things they're looking to benchmark. Perhaps user-friendly software can help.

UniPhi's focus is on benchmarking everything cost and time related to construction projects. The range of this is almost endless and includes things like:

  • Average cost per m2/SF of floor area
  • Average elemental rates
  • Ratios between floor and wall areas
  • Revenue per net lettable area
  • Average duration of design phases
  • Average time to tender and procure 
  • Average time to achieve final account (We've gotten this down to two weeks for our construction clients)
  • Average percentage of variations to original price
  • Etc
Typically, this information is stored in spreadsheets, and an email is sent around requesting people to send files for projects like XYZ. The responses are then consolidated and analysed by a BI team, and hopefully some insights are obtained one to two weeks after the request went out.

Data Capture

The way UniPhi has approached this issue is to make sure we integrate with as many underlying cost planning tools as possible and to streamline the process of importing an estimate into our Costs module. We then link this process to something of benefit to the person importing the plan. This is usually in the shape of assisting them to generate a report to a client (either internal or external). By making their life easier we have created an incentive to actually go to the trouble of importing the data into our application.

Classification

Once the data is captured, the next challenge is to make sure it is valid and to provide enough meta data or classifications to it to allow benchmarking end users to be able to obtain a collection of similar projects to benchmark against.

First, the metadata. All projects created in UniPhi require four customisable classification items to describe them. The out-of-the-box labels for these are:
  1. Sector
  2. Project Type
  3. Service Line
  4. Location
Users of our software can then add an unlimited number of additional classification drop downs. The typical additional classifications are:
  1. Work Type
  2. Floor Area
  3. And some sector specific functional units like number of keys or apartments, theatres, or beds.
As these are mandatory fields when creating the project, you end up having classified a project as, for example: Commercial sector, Office Tower, Sydney, New Build, 30,000 sqm and 30 levels.

Add a project description and you've already got a lot of information about the project. The key with this metadata is that it is keyed in once. Every piece of content subsequently created against that project inherits it, so when the estimator or cost manager imports a cost plan into a particular project, the sector, project type, work type and floor area are automatically associated with this new piece of content.
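
As a sketch of the inheritance concept (illustrative only, with made-up field names), every piece of content simply carries the project's classification tags:

```python
project = {
    "name": "Harbour Tower",
    "sector": "Commercial", "project_type": "Office Tower",
    "location": "Sydney", "work_type": "New Build",
    "floor_area_sqm": 30_000,
}

def new_content(project: dict, **fields) -> dict:
    """Any new content (a cost plan, a contract) inherits the project tags."""
    tags = {k: v for k, v in project.items() if k != "name"}
    return {"project": project["name"], **tags, **fields}

cost_plan = new_content(project, doc_type="cost_plan", total=52_000_000)
print(cost_plan["sector"], cost_plan["work_type"])  # Commercial New Build
```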


Quality Control

At this stage we have a cost for a project and an association to the type of project, but how do we quality control the data? This is where UniPhi's documents module comes into play. The documents system aggregates information stored in the rest of the application and workflows it to the relevant parties for review and sign off. Only signed-off pieces of information are available in the resulting benchmark reports.

Similarly, our new benchmark algorithm (See Part 4 of this series) relies on the certified progress claims of completed projects as the basis for generating phased forecasts.

At risk of stating the obvious, the quality of your information is crucial in generating useful benchmarking results, and with some simple rules and workflows embedded in the software, we believe this is not as difficult a task as it once was.


Results

Once we have lots of projects captured in our database, benchmarking is as simple as running a query and reviewing the resulting data set of sample projects ready for comparison.


Wednesday, February 24, 2016

Benchmarking - 4 Part Series

After a bit of a hiatus, the UniPhi blog is back. I have taken over the job of writing a weekly blog entry covering key problems our software is trying to solve. Over the next 12 weeks I will publish 3 new blog series with 4 articles in each series. The 3 series will cover the following key functions of UniPhi's software:

  • Benchmarking
  • Contract Administration
  • Cost Management
These have become key areas of our application for our clients, and we believe our approach to them is unique. Probably the most successful of all has been benchmarking, where we have won back-to-back Australian Business Awards for innovation.

It is true that most of the work in this area has been for one of our Fortune 500 clients (AECOM), and it has produced an excellent end product called Guide that provides their clients with benchmark data as well as the ability to price new projects within seconds using key design parameters and the law of large numbers.

However, we have recently been able to take some of the learnings from this work and incorporate the ability to capture key metrics into our application, which, due to its portfolio nature, already had significant benchmarking capability. This will be the topic of my first post, to be published tomorrow.




Friday, September 04, 2015

Managing the things you can't control

We recently released a white paper focused on complex projects in the construction industry and how creating a more adaptive and agile environment can assist in managing things that are beyond your control (you can see the media release and download the paper here http://uniphi.com.au/press-release/whitepaper-managing-the-things-you-cant-control/).

What's been interesting since is the number of articles of a similar nature in the mainstream media. What's even more interesting is that most of the commentary suggests there's nothing you can do to influence a company's reaction to things beyond its control. One example was the Qantas results. Matt O'Sullivan wrote an excellent article quoting ex-Qantas economist Tony Webber on Qantas' big turnaround in profitability: "You can't criticise him but I still think that a lot of it has to do with things completely beyond his control – most of it, in fact".

Read more: http://www.smh.com.au/business/aviation/who-won-the-war-between-qantas-alan-joyce-and-virgins-john-borghetti-20150821-gj4n52.html#ixzz3kievyMhV

Our view is that the best CEOs manage the things beyond their control by creating organisations made up of agile teams that can adapt to wherever the wind blows. Doing this means that when things are bad you're less bad than your competitors, and when things are good you're doing even better. And we believe we've built software that can help those teams notice which way the wind blows and collaborate quicker to adapt. One key to this is transparency, which will be the topic of our next white paper, due out next year.