
Mark-to-data?

The American Securitization Forum’s (ASF) work to enable the tracking of individual US mortgages is just one example of the finance world’s renewed faith in fact. This, in turn, is paving the way for a new ‘mark-to-data’ standard.

Toxic assets were originally decoupled from reality and marked to model because the industry assumed there was simply too much data to compute. This misconception meant that data about real-world events – e.g. subprime borrowing, unstable collateral, defaults – collected dust until a news organisation or government agency came along and folded it into socio-demographic averages and macro-economic trends. Essentially, yesterday’s facts were being turned into today’s assumptions, and when the differences between the two were bigger than expected, the results were fatal.

Those who still believe that there’s too much data for valuation need to remember that it’s 2009. There is a tremendous amount of intellectual and technological firepower available to compute fact-based assessments of value and risk and make “mark to data” a possibility. At the same time, industry efforts like the ASF’s mortgage codes will standardize essential facts and make all this information more comprehensible and readily available.
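To make the contrast concrete, here is a minimal, purely illustrative sketch in Python. The loan fields, function names and numbers are invented for the sake of the example and have nothing to do with any actual ASF code set; the point is only to show how pricing a pool off an assumed average differs from pricing it off loan-level facts.

```python
# Toy illustration only: one small mortgage pool priced two ways.
# All field names, rates and discount logic are hypothetical.

loans = [
    {"balance": 200_000, "in_default": False},
    {"balance": 150_000, "in_default": True},
    {"balance": 300_000, "in_default": False},
]

RECOVERY_RATE = 0.40  # assumed recovery on a defaulted balance


def mark_to_model(loans, assumed_default_rate=0.02):
    """Price the pool off a macro-level assumption (yesterday's average)."""
    total = sum(loan["balance"] for loan in loans)
    expected_loss = total * assumed_default_rate * (1 - RECOVERY_RATE)
    return total - expected_loss


def mark_to_data(loans):
    """Price the pool off observed loan-level facts: actual defaults."""
    value = 0.0
    for loan in loans:
        if loan["in_default"]:
            value += loan["balance"] * RECOVERY_RATE  # only recovery value remains
        else:
            value += loan["balance"]
    return value


print(f"Mark to model: {mark_to_model(loans):,.0f}")  # 642,200
print(f"Mark to data:  {mark_to_data(loans):,.0f}")   # 560,000
```

The numbers themselves are beside the point; what matters is the source of the inputs. The second calculation is only possible if loan-level facts are standardised and readily available, which is exactly what efforts like the ASF’s mortgage codes aim to enable.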

The times demand radical data availability and transparency. We all need more data, and better data, and we need to look beyond the markets and academia for a new consensus on valuation best practices.

