**Model Behavior**

*High-frequency trading has the potential to make markets more volatile than ever. Federico Bandi uses complex mathematical models to find reason—and to predict behavior—amid the chaos. By Mat Edelson*

**Call** it a moment of monetary mayhem that, like a partial nuclear meltdown, could have wrought unimaginable devastation had it run fully amok. On May 6 of last year, at 2:41 p.m., Wall Street’s financial control rods failed, sending the market into free fall. Due to frenzied computer trading, in 300 seconds the Dow Jones average dropped 600 points.

It
was a full-out binary panic, the “flash crash” of 2010. Shocked traders watched
helplessly as the machines they depended upon unexpectedly took Wall Street to
the brink of chaos. Later reports in the *New York Times* determined that a single trade made in Kansas began this electronic financial fission. It was powered at the trading equivalent of light speed by so-called high-frequency transactions—computerized deals that occur in microseconds, faster than any human can reckon.
Such
voluminous dealing always creates “noise” in the markets, but in this case the
electronic forces—the SEC later found that 140,000 individual contracts had cascaded
back and forth in three minutes, culminating in a frenzy of 27,000 trades in 14
seconds—had turned into runaway sell orders that threatened the whole system. The
suddenly seismic volatility initiated an automated but uncoordinated set of
checks and balances that sent stock prices crashing; according to the *Times*, a share of Procter & Gamble could momentarily be had for 1 cent. Only another computerized signal coming out of the Chicago Mercantile Exchange created the financial equivalent of throwing boron on the pile: For five precious seconds the cascade was somehow interrupted, long enough in supercomputer time for the cosmic reboot button to kick in and halt the skid.
Within
20 minutes, the Dow had fully recovered the losses sustained in the flash crash,
but those who ran the markets were badly shaken. More than 20,000 trades
involving 300 securities had to be retroactively cancelled after the close of
trading because of their obviously erroneous pricing caused by the crash. The
market makers had been jolted into the reality that high-frequency trading was
a more powerful—and potentially dangerous—tool than even they had imagined, rapidly
implemented but often poorly understood. By their nature, these very trades energize
(and perhaps overheat) the markets, creating volumes—and volatilities—unheard
of just a generation ago. Even after the market recovered from the flash crash,
it went on to lose more than 3 percent on that day, and still no one is quite
sure what role the mini-meltdown played in the day’s final tally.

That’s
the kind of black hole that market players up and down the line could no longer
tolerate, where for unknown reasons the very foundation of every decision they
make—the validity of the price of a commodity—could suddenly be called into
question. What they needed were whiz-bang types who could get inside the high-frequency
trading mechanism, look at the data it created, make sense of the seemingly
indecipherable, and help create models in which volatility and market stability
could peacefully coexist.

What
they needed was Federico Bandi.

***

The soft-spoken Bandi arrived at the Johns Hopkins
Carey Business School with research that built on the work of New York
University’s Robert Engle, whose studies of market volatility in the 1980s
became the underpinning for many forecasting models and earned him the Nobel
Memorial Prize in Economic Sciences in 2003. What
Engle began, Bandi and perhaps a dozen colleagues around the world have built upon.
Think of it as Financial Econometrics 2.0: the use of sophisticated mathematical
models to predict how individual financial products and overall markets will
behave over time. The field has numerous practical applications. For example, financial
econometricians seek a financial instrument’s “true” value, such as the proper interest
rate on a bond or price of a stock. In regard to a stock price, that could mean
creating mathematical models that separate a company’s real assets—the core of
stock valuation—from the distortion of temporary pressures, such as a large
sale by a major investor in need of cash, that can spook the market but may
have little to do with the company’s actual performance. Everyone from hedge fund
managers to CEOs to national fiscal policymakers now keep an eye on the discipline,
looking to glean from financial econometricians better ways to conduct their
business.

When
Engle was doing his seminal work—defining the relationship between what occurs
over a period of time (aka time-series measurements) and the up-and-down
movements of a given market (aka volatility)—financial econometrics was in its
infancy, Intel 80286 computers were state-of-the-art, and the focus was on low-frequency
trading and financial forecasts measured in days or months. The work was a
little bit like figuring out the languid movement of a planet across the
heavens versus the frenetic motion of an electron around an atom. The jump to
understanding high-frequency trading became necessary as computing power
exploded. Newer, faster trading models made their way to the markets, in no small
part because each trade meant additional monies to brokers. To put it in
perspective, the New York Stock Exchange’s trading volume surged from a high of 100 million shares on a single day in 1982 to 10 times *that* in 1997, to roughly 2.6 billion shares traded on the day of the flash crash. Some estimates say that perhaps 60 percent of all stock shares currently traded on the NYSE and NASDAQ involve high-frequency computer algorithms. One market watchdog, Tabb Group, estimates that firms involved in such trading earned nearly $13 billion from their high-frequency transactions during the last two years.
Any
way you look at it, that’s a lot of market noise. Fortunately, the same computers
that make the mind-boggling trading volume possible also allow researchers such
as Bandi to make some sense of the cacophony. By bringing to bear his economic intuition
and mathematical acumen, Bandi works and reworks what to him is an irresistible
exercise: Look at all trades across a given time-span, distinguish meaningful
trends from flukes, represent those trends mathematically, account for the
right amount of volatility, and see if his models hold true when applied to
future trading. What he does is the equivalent of a tourist getting on a packed,
noisy subway car at rush hour and amidst a thousand conversations hearing the
only one that matters—where the train is headed. By better understanding asset
volatility, the markets—and by implication, their computerized trading programs—might
be made more impervious to trading shocks that threaten to destabilize them.
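One flavor of this signal-versus-noise problem can be sketched in a few lines of code. The example below is purely illustrative (simulated prices and invented parameters, using a textbook realized-variance calculation rather than anything drawn from Bandi’s own papers), but it shows why tick-by-tick data can mislead: summing squared returns over every observation lets microstructure noise swamp the true volatility, while sparser sampling comes closer to it.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical setup: one trading day of second-by-second log-prices.
n = 23_400                # seconds in a 6.5-hour trading session
sigma_day = 0.02          # assumed 2% daily volatility of the "true" price
true_iv = sigma_day ** 2  # integrated variance we are trying to recover

# Efficient (true) log-price: a random walk with the chosen daily variance.
efficient = np.cumsum(rng.normal(0.0, sigma_day / np.sqrt(n), size=n))

# Observed price = efficient price + microstructure noise (bid-ask bounce, etc.).
noise = rng.normal(0.0, 0.0005, size=n)
observed = efficient + noise

def realized_variance(log_prices, every):
    """Sum of squared returns, sampling one price every `every` seconds."""
    sampled = log_prices[::every]
    returns = np.diff(sampled)
    return float(np.sum(returns ** 2))

rv_dense = realized_variance(observed, every=1)     # use every tick
rv_sparse = realized_variance(observed, every=300)  # sample every 5 minutes

print(f"true integrated variance: {true_iv:.6f}")
print(f"RV, 1-second sampling:    {rv_dense:.6f}")
print(f"RV, 5-minute sampling:    {rv_sparse:.6f}")
```

The dense estimate blows up because each extra increment adds another dose of noise variance while the true variance stays fixed; sparser sampling trades a little statistical precision for far less noise contamination, which is the tension behind the "optimal sampling" question in this literature.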

***

Sitting in
his 13th-floor office at Carey, one sheet of glass comfortably separating him
from an oppressive summer morning in the Baltimore neighborhood of Harbor East,
Bandi is asked what he tells strangers who lead with that ever-invasive opening
salvo: “So, what do you do for a living?”

“I
tell them I write mathematical models,” he says, breaking into a big smile, “and
that usually ends the conversation.”

One can understand, given papers with titles like “Microstructure Noise,
Realized Variance, and Optimal Sampling,” “Time-Varying Leverage Effects,” and
“Fully Nonparametric Estimation of Scalar Diffusion Models,” why Bandi’s work
seems inaccessible to an outsider. Yet he insists that, higher math aside, it’s
not.

“The
idea is relatively simple,” he says in thoughtful, measured tones. “There are
quantities of items that people are interested in and talk about all the time,
like inflation rates, interest rates, and stock returns. You can actually write
down mathematical expressions that tell you rather clearly how these objects
move around and evolve as time goes by.” Each increment in accuracy can mean
the difference between a 401(k) in crisis or a flush retirement fund, or trading
algorithms that are more finely tuned to prevent overreaction and massive
selloffs.

What
makes Bandi unique in his field is that he’s a bit of an intellectual estuary;
where many of his contemporaries stick to either the theoretical or the applied
side of econometrics, Bandi flows seamlessly between the two. As such he finds
himself in a rare position, not unlike Oppenheimer at Los Alamos: a respected
leader who converses, based on his own bona fides, with both fundamental and
applied researchers, uniting them to move the field forward. (Bandi notes that Oppenheimer’s
ilk—physicists—are among the dizzying array of talents drawn from many disciplines who are bringing their mathematical skills to bear on financial
econometrics.)

In
the classroom, Bandi's verbal and mental dexterity appeals to students, who
consistently gave him outstanding marks for his “clear and interesting
delivery” of executive, MBA, and PhD course materials during his time at the
University of Chicago’s Booth School of Business, from 1999 to 2009.

But it’s
the challenge of making hard predictions more predictable that drives Bandi. To
hear his passion for his field is to witness someone at the intersection of art
and science. As a youngster in Milan, he went to what he calls “a very
scientific high school” that first exposed him to advanced math, which became
his Rosetta stone as he dug into economics at Yale, where he received his
master’s and doctorate. He uses interest rates as an example of applying math
to a conundrum.

Economists believe that interest rates always rise and fall
over time. Mathematicians often use the idea of *stationarity*—a mean number around which a set of data hovers—to predict trading trends. So, for example, if your interest rate on a given day is way above the stationarity point, at some point it should begin to decline toward the mean. But what do you do with that concept when interest rates plunge, and then keep on plunging with no expected cyclical rise, as they have steadily since the 1980s? Now how do you predict where they’re headed?
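The pull toward a long-run mean can be illustrated with a standard mean-reverting model. The Vasicek-style equation below is a textbook construction with made-up parameters, not the spatial approach Bandi actually developed: a rate that starts well above its long-run level gets tugged back toward it, step by step.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (not from Bandi's work): a Vasicek-style
# mean-reverting short rate, dr = kappa*(theta - r)*dt + sigma*dW.
kappa = 2.0      # speed of reversion toward the mean
theta = 0.05     # long-run mean rate (5%)
sigma = 0.01     # volatility of the rate
dt = 1 / 252     # daily time steps
steps = 252 * 5  # five years of daily steps

r = np.empty(steps + 1)
r[0] = 0.12      # start far above the mean, "way above the stationarity point"
for t in range(steps):
    # Each step: a deterministic pull toward theta, plus a random shock.
    r[t + 1] = r[t] + kappa * (theta - r[t]) * dt + sigma * np.sqrt(dt) * rng.normal()

print(f"start: {r[0]:.3f}, after 5 years: {r[-1]:.3f}, long-run mean: {theta}")
```

The puzzle Bandi describes is exactly what happens when real data refuse to behave this way: if observed rates keep falling for decades instead of drifting back toward theta, a model built on stationarity loses its predictive force.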
“We weren’t
seeing much reversion to the mean, and if things aren’t stationary and you’re
making long-term predictions, [those predictions] become much weaker,” says
Bandi. “That made me think about alternative ways to model interest rates. And
I realized there was a friction between what I was observing and my intuition
as an economist. At the end of the day, it was hard to believe there wasn’t
some stationarity in any interest rate series, that they wouldn’t get back to
some level—they’re not going to go down to zero and be stuck at zero forever. Yet
the data didn’t account for stationarity, so how do you model that?”

His answer, in a 2002 *Journal of Financial Economics* paper titled “Short-Term Interest Rate Dynamics: A Spatial Approach,” employed new methods to work around the stationarity issue and provided a new tool for predicting interest rates. Bandi says these short-term rates are the kind of “underlying object” that for financial product managers play a huge role in “models that are used to price bonds and fixed-income securities.”
This
ebb and flow between observed problem and mathematical solutions—or at least
propositions—is for Bandi like running the rapids: constantly churning and more
than a bit of a rush. When asked where he likes to work, he waves a dismissive
hand at his computer screen. “That’s where I answer my emails. The tedious
stuff. I like to work in unusual circumstances. Some people find it easier to
write down everything on a piece of paper. I don’t do that.” He prefers to hop
into his car, flip on anything from classical music to salsa, and think while
he drives. Or take a long walk around his D.C. neighborhood and think some
more. Or go to sleep... and think still more. And somewhere in all that
cogitating—maybe it’s the stimulation of Mozart and Frankie Ruiz—bingo!

“Being
in those everyday situations helps the process because it stays with you,
you’re sort of metabolizing it a lot better,” says Bandi. “I’ve had multiple
occasions when I’ve gone to bed, thought about something I’m working on, and woken
up in the morning with a reasonable solution. There’s nothing magical about it;
it’s just an indication that your brain keeps on working.”

While
he acknowledges that many respectable authors fall in love with a theory and
then find a problem to wrap it around, “that doesn’t suit me. I don’t just work
on a model for the sake of it. Usually I go in the opposite direction. I look
at the data. I think about the data. I think about a problem that has an
applied nature, and the model is the way to address the problem I observe in
the data. It may be a strongly theoretical solution, but always to an applied
problem. I think that’s the intellectually correct way to go about a problem.”

Some
of Bandi’s latest work deals with a new movement in financial data sifting:
nonparametric versus parametric models. The latter builds basic assumptions or
parameters into the model. For example, instead of looking at every stock trade
on a given day, the parametric approach limits the forecasting model to
companies that trade between $30 and $50 per share. That’s been the traditional
approach. By contrast, nonparametric models essentially reverse-engineer the
problem, saying, “Let’s look at *all* of those trades without constraint to see if we can find patterns in all that data and build predictive models off of that.” The advantage of the nonparametric approach is that it could overcome faulty assumptions built into parametric models.
To use a baseball analogy, a general manager using a parametric approach
might assume that, in a given year, the best hitters in the league are first
basemen; he would then ask his computer to pull the batting averages of the top
three first basemen so he can decide which one to trade for. But a GM using a
nonparametric approach with no assumptions, by looking at *every* hit by *every* player in the league, might discover that it’s actually center fielders who are doing the best hitting, and then build a model that figures out which center fielder to target. When applied to high-frequency trading and volatility, nonparametric modeling might take preconceived notions out of the picture, allowing the data to better express its own true tendencies.
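The contrast can be made concrete with a toy regression. Here the data follow a curved pattern; a parametric model that assumes a straight line (standing in for any faulty built-in assumption) misses it, while a nonparametric kernel smoother, which assumes no functional form at all, recovers it. The setup and numbers are invented for illustration and have nothing to do with any particular market.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data with a nonlinear pattern: true relationship is sin(x), plus noise.
x = np.linspace(-3, 3, 200)
y = np.sin(x) + rng.normal(0.0, 0.1, size=x.size)

# Parametric approach: assume (wrongly) that the relationship is a line.
slope, intercept = np.polyfit(x, y, 1)
linear_fit = slope * x + intercept

# Nonparametric approach: a Nadaraya-Watson kernel smoother imposes no
# functional form; each prediction is a locally weighted average of the data.
def kernel_smooth(x_train, y_train, x_eval, bandwidth=0.3):
    out = np.empty_like(x_eval)
    for i, x0 in enumerate(x_eval):
        w = np.exp(-0.5 * ((x_train - x0) / bandwidth) ** 2)  # Gaussian weights
        out[i] = np.sum(w * y_train) / np.sum(w)
    return out

smooth_fit = kernel_smooth(x, y, x)

# Compare each fit against the true curve.
mse_linear = float(np.mean((linear_fit - np.sin(x)) ** 2))
mse_smooth = float(np.mean((smooth_fit - np.sin(x)) ** 2))
print(f"parametric (linear) MSE vs truth:    {mse_linear:.4f}")
print(f"nonparametric (kernel) MSE vs truth: {mse_smooth:.4f}")
```

The trade-off, of course, is that letting the data speak requires far more data; that is why nonparametric methods and the explosion of high-frequency records arrived together.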
With
Bandi’s growing recognition within his own field—his papers have been cited in
economics journals more than 1,000 times—there’s a sense that his influence is
being felt beyond the walls of academia. Former Carey professor Celso Brunetti,
who now works at the Federal Reserve, calls Bandi an “elegant” mathematician
whose work is well-known by people at the Fed. “One of his main contributions
is he basically developed the theory *and* the application of how to measure the volatility of financial assets,” says Brunetti. “This is a major breakthrough, extremely important for policymakers when they have to regulate markets; for CEOs running a company who want to know the volatility of crude oil, for example; and for portfolio managers because they need to compute how risky is their portfolio.”
And
while the *Wall Street Journal* and other mainstream consumer publications have yet to write about Bandi’s work, Brunetti and others suggest that players in the biggest financial houses are among the cognoscenti. That includes Michelle Yang, a financial engineer in Moody’s Credit Policy Department, who says she used Bandi’s papers to help develop trading strategies involving equity derivatives while she was at Merrill Lynch. “[His work] definitely was one of the things we looked at,” she says. “My boss at Merrill knew Federico, and, at Moody’s, my boss’s boss, he knows [Federico’s] work.”
Bandi’s knowledge is also being sought out directly
by market makers as he’s lectured to business leaders in New York, Europe, and
Asia. Fellow financial econometrician Torben Andersen of Northwestern
University’s Kellogg School of Management isn’t surprised, putting Bandi’s
versatility in baseball terms (must be a summertime Chicago thing): “He
certainly hits with power to all fields, making his theories relevant to
financiers and economists with applications that can easily reach into Wall
Street for sure, such as characterizing liquidity.”

Andersen, who just wrote
about the flash crash for the Social Science Research Network, says Bandi’s
concepts could help the markets bolster themselves against future incidents,
which, given Wall Street’s recent debt ceiling–induced roller coaster, can only
be seen as a good thing. Bandi himself notes that when markets start going
nuclear, “big spikes in volatility occur precisely when you have big spikes in
asset prices,” and the relationship is an inverse one: volatility surges as
prices fall, so when markets overheat, prices plunge like, well, nuclear
winter. “It’s what I’m working on right now,
figuring out the relationship between the changes in stock prices and changes
in volatility. It’s not easy.”

No,
but it may be the only thing standing between the markets and more meltdowns.

*Mat Edelson is a Baltimore-based freelance writer whose feature work has previously appeared in ONE.*