Press Interviews and Quotes

Partly just to keep track of them for my BU annual report, this post links to my press interviews and quotes.


Effects of Brexit on the world economy and Iran.

Tasnim News Agency Interview, Iran (translated into Persian), June 28, 2016.

Farm Animal Ballot Initiative.

Posted on YouTube by Brittany Comak, BU School of Communication, November 29, 2015.

Inspector General criticizes Red Sox, BRA deal. Daily Free Press, October 29, 2015.

Mass. experiencing economic boom, study finds. Daily Free Press, October 20, 2015.

$200 million in extra funding approved for infrastructure repairs. Daily Free Press, April 14, 2015.

Mass. lawmakers support bill that would legalize and tax marijuana. Daily Free Press, March 18, 2015.

MBTA late-night service threatened by lack of sponsorship. Daily Free Press, by Paige Smith, January 27, 2015.

Part-time BU employees now eligible for health, dental benefits. Daily Free Press, by Rachel Legon, October 30, 2014.

Student loan report shows complaints, problems with private lenders. Daily Free Press, by Meiling Bedard, October 21, 2014.

Greater Boston GDP declining, report finds. Daily Free Press, by Mina Corpuz, September 18, 2014.

Small businesses get extension for ACA in Massachusetts. Daily Free Press, April 27, 2014.

STUDY: Grad student loan debt on the rise. Daily Free Press, March 26, 2014.

College worthwhile investment, study suggests. Daily Free Press, February 26, 2014.

Bitcoin ATM installed in South Station. Daily Free Press, February 23, 2014.

Cost of student loan programs difficult for federal government to determine, study suggests. Daily Free Press, February 4, 2014.

Universities see an increase in endowments, study suggests. Daily Free Press, January 29, 2014.

“South Shore Hospital, Partners HealthCare defend merger plan.” The Patriot Ledger, by Christian Schiavone, January 17, 2014.

“The Healing Begins for” TechNewsWorld, by Erika Morphy, December 4, 2013.

“Financial squeeze awaits W.Pa. hospitals.” TribLive, by Alex Nixon, October 24, 2013.

“Wen er maar aan, Obamacare blijft” (“Just get used to it, Obamacare is here to stay”). Trouw, foreign desk (article on Obamacare in a Dutch newspaper, in Dutch). October 4, 2013.

Menino plans to build about 30,000 housing units by 2020. Daily Free Press, Boston University, Sep 11, 2013

“Boston welcomes startups, entrepreneurship, despite study results” Daily Free Press, Boston University, Sep 11, 2013

Mass. residents driving less since 2004, study suggests. Daily Free Press, Boston University, Sept 4, 2013.

Medical Costs Register First Decline Since 1970s. Wall Street Journal Blog, June 18, 2013.

Employers fear economic climate, fail to make hires, new study suggests. Daily Free Press, Boston University, April 24, 2013.

Outside spending in Senate race tops $1.25 million. Daily Free Press, Boston University, April 9, 2013.

New delayed-start loan repayment plan may help grads. Daily Free Press, Boston University, April 3, 2013.

Years after recession, Mass. job numbers finally bounce back. Daily Free Press, Boston University, April 2, 2013.

Popeye’s President Unconcerned About Obamacare, Says Health Insurance ‘Just Not Affordable.’ Huffington Post, March 28, 2013.

Tax-based aid needs reform, report suggests. The Daily Free Press. Boston University. Feb 27, 2013.

Minimum wage-earners face hardship paying rent. The Daily Free Press, Boston University, Feb 5, 2013

Gas prices in Mass. shoot up 14 cents a gallon. The Daily Free Press, Boston University. Feb 5, 2013

Freshmen see college as necessary to riches, study suggests. The Daily Free Press, Boston University. Jan 30, 2013.

College debt high despite lower credit card, general debt. The Daily Free Press, Boston University. Dec 3, 2012

Government officials demand sales tax for items bought online. The Daily Free Press, Boston University. Nov 27, 2012

Interview with ASHEcon President Randy Ellis. American Society of Health Economists (ASHEcon) Newsletter Vol. 4 Fall 2010.



Insurers are doing well under ObamaCare

Much was made recently about how UnitedHealth decided to drop out of the ACA federal exchange in several states. It is important to realize that, far from being a failure for large insurers (UnitedHealth is the largest insurer in the US), health insurance remains extremely profitable under ObamaCare. Below is a bar chart of the percentage change in stock prices of the five largest publicly traded health insurers in the US from March 23, 2010, when the ACA was signed, to today (6/6/2016).

Top Five Health Insurer Stock Prices under ObamaCare


Hillary Clinton’s Memorial Day speech 2016 will be a classic.

My previous post was discouraging in that it was all about the techniques Donald Trump is using to build support. This one is much more upbeat.

I had heard about but not actually watched or listened to Hillary Clinton’s Memorial Day speech, which has been getting a lot of favorable reviews. I just finished listening to it while checking email, and it was indeed very inspiring, hopeful, and the best speech yet from Hillary. I recommend it very highly, even if it is 34 minutes long. Here is one link, from the Daily Kos, where you can watch or listen to the entire speech.

I also like that they post links showing how it was received by various sites. Choose your favorite to get a perspective.

If enough voters could watch this clip, it would change the entire discussion.


Sobering Scott Adams interview about Trump on Real Time

This email will interest people who are closely following the US presidential election.

I invite you to watch the following four minute video on Trump and Hillary when you have time. It changed my thinking.

On the Friday, June 2 Real Time with Bill Maher, Scott Adams (the genius creator of the Dilbert comic strip) gave a very sober assessment of why Trump has been so successful. Adams is a trained hypnotist and has written several books on the art of persuasion. Adams has a blog, and last summer he predicted Trump would win by a landslide. This, plus his careful comments, made his assessment feel very real to me. According to Adams, Trump is not a fool; rather, he is a high-IQ master persuader.

The Real Time clip was taken down from YouTube, where they want you to subscribe to Real Time, but it is still linked here.

Here is a link to the Scott Adams blog.

Here is a link to his October 2015 blog post where he predicted Trump would win in a landslide.

Here are the first three sentences from that post:

“The latest poll out of Iowa shows Carson ahead of Trump. And you know what that means?

It means Iowa is about to become irrelevant. Here I am assuming evangelicals will band together to give Carson the win in Iowa before Trump goes on to run the table everywhere else.”

I also liked this review from the Washington Post in March 2016. It summarizes the six principles Trump is using to win.

Here are his principles in list form.

  1. Trump knows people are basically irrational.
  2. Knowing that people are irrational, Trump aims to appeal on an emotional level.
  3. By running on emotion, facts don’t matter.
  4. If facts don’t matter, you can’t really be “wrong.”
  5. With fewer facts in play, it’s easier to bend reality.
  6. To bend reality, Trump is a master of identity politics — and identity is the strongest persuader.

Here are the final lines from that Washington Post article.

Writes Adams: “Identity is always the strongest level of persuasion. The only way to beat it is with dirty tricks or a stronger identity play. … [And] Trump is well on his way to owning the identities of American, Alpha Males, and Women Who Like Alpha Males. Clinton is well on her way to owning the identities of angry women, beta males, immigrants, and disenfranchised minorities.

“If this were poker, which hand looks stronger to you for a national election?”

Watch the Real Time interview. And think about how you can get your favorite politician – local, state or national – to be better at persuasion and the use of modern media.


Congratulations to BU’s Class of 2016 Economics graduates!

Please celebrate the students who earned 498 Boston University degrees in Economics at Commencement this May.

This year the program mentions:

22 Ph.D. recipients

203 Master’s degree recipients (MA, MAPE, MAEP, MAGDE, MA/MBA, BA/MA)

273 BA recipients (including BA/MA)

This total of 498 degrees is up from 482 in 2015.

These numbers undercount the total for the year since they may exclude students who graduated in January 2016 and chose not to appear at Commencement.

The number of graduate degree recipients (225) is way up from last year, when we had 177, with most of the growth in MAs.

In 2015 there were 22 PhDs, 155 Master’s degree recipients, and 305 BA recipients.

In 2014 there were 17 PhDs, 207 Master’s degree recipients, and 256 BA recipients.

Altogether 24 Ph.D. students obtained jobs this year (versus 19 last year).

To see the Ph.D. placements visit the web site linked here.

The department’s website now lists 38 regular faculty (down two from last year) with titles of assistant, associate, or full professor, a number which is two below the number of professors in 2012.


Congratulations to all!

Top 100 Economics Blogs of 2016

I just got an email from Prateek Agarwal <>

He has compiled a list of the Top 100 Economics Blogs of 2016. I am of course not on it since I blog infrequently and do not archive (and make public) on my web site all of my blogs, but I thought I would share the link he provided.

Lots of interesting links, including The Incidental Economist, which is the only one I subscribe to. Be warned that reading blogs can be a major time waster.

I plan to archive this one on my web site blog.

Hope to see many of you at ASHEcon. (Not too late to sign up for the dinner)


24 BU economics Ph.D.s accept jobs in academic, government and private sectors

Congratulations to the 24 current or recent Ph.D. students from BU Economics who have accepted jobs for this September. Recent placements from the department are linked here, as well as pasted below.

This year candidates selected 14 academic jobs, five government or central bank jobs, and five private sector jobs for their initial placement.

Everyone formally on the market this year accepted a job offer.

Thank you, BU colleagues and outside letter writers, for your help in getting them jobs.
Let me also thank the five BU staff members who helped with the application process:
Gillian Gurish, Norma Hardeo, Miriam Hatoum, Deb Kasabian, Gloria Murray.
Special thanks to Gillian Gurish for her excellent web page support to the job seekers.

Randy Ellis, Economics department placement officer

2016 PhD Placements

Economics PhD Accepted Job Offers

Amazon, Seattle, WA Jiaxuan Li
Amazon, Seattle, WA Fan Zhuo
Bank of Canada, Ottawa, Canada Guihai Zhao
Central Bank of Chile, Financial Policy Division, Chile (senior economist)  J. Felipe Cordova
Central Bank of Chile, Chile Patricio Toro
Columbia University, Graduate School of Business, NY (post doc), then Federal Reserve Bank, Board of Governors, DC Levent Altinoglu
Cornerstone Research, Boston MA Francois Guay
Cornerstone Research, Chicago IL Kavan Kucko
Duke University, Sanford School of Public Policy, Raleigh NC (post doc) Matt Johnson
ECARES, Universite Libre de Bruxelles, Belgium (post doc) Ben Solow
Ernst & Young, NYC Mengmeng Li
Harvard Medical School, Department of Health Care Policy, Boston MA (asst. prof.) Tim Layton
Harvard Medical School, Mass General Hospital, Disparities Research Unit, Boston MA (lecturer/research scientist) Ye Wang
Harvard Medical School, Mass General Hospital, Disparities Research Unit, Boston MA (post doc) Mirk Fillbrunn
Holy Cross, MA (visiting asst. prof.) Dan Schwab
London School of Economics, Department of Social Policy and   Imperial College, School of Public Health, Health Economics Group, London UK (joint position, post doc) Sara Machado
Peking University, Marketing, Guanghua School of Management, China (asst. prof.) Ying Lei
Reserve Bank of India, Center for Advanced Financial Research and Learning, Mumbai India (research director) Apoorva Javadekar
Shanghai University of Finance and Economics, School of Economics, China (asst. prof.) Yao Shu
Shanghai University of Finance and Economics, School of Finance, China (asst. prof.) Ei Yang
Texas A&M, Department of Political Science, TX (asst. prof.) Benjamin Ogden
US Census Bureau, Center for Economic Studies, MD Elisabeth Perlman
Wellesley College, Department of Economics, MA (lecturer) Alex Poterack
Xiamen University, Department of Economics and Wang Yanan Institute for Studies in Economics (WISE), China (asst. prof.) Shuheng Lin

Obamacare reality: It is working

At a time in the US when all of the Republican presidential candidates are declaring Obamacare a failure that needs to be undone, it is worth noting the REALITY that it is succeeding in its primary purpose of covering more Americans with health insurance. It does not mandate insurance coverage, but the subsidies and tax penalties for not having insurance are motivating more people to get insurance. Twenty million more people now have health insurance than did before. (Click on graphs for a clearer image.)

 20 Million Gained Health Insurance From Obamacare, President Says
The Huffington Post

Uninsured rate (Gallup-Healthways)

Even though cost containment was not its primary goal, Obamacare is also reducing, not increasing, the costs of health care. Since many people don’t trust the government, here are some private sector slides. PricewaterhouseCoopers, an actuarial and consulting firm not known for being political, forecasts that health expenditure cost growth in 2016 will continue to slow down.

Here are my two favorite slides from their chart pack. Note the changes since 2010.

Figure: PwC trends in GDP and NHE growth.

My view is that the above figure is misleading, since the decline in growth rates did not start in 1961, but the slow growth since 2010 is still clearly evident.


Figure: spending growth rate (PwC, 2016).

Obamacare is working. We just don’t have enough leaders and media telling us this.


Note: I sent this blog to my BUHealth email list.

Let me know if you would like to be added as a BUHealthFriends subscriber by emailing ellisrp at

What do BU undergraduates do when they graduate?

I am often asked by undergrads and MA students what BU students do when they graduate. For the first time that I know of, BU has published a relatively complete description and list of the places where they get jobs, go to graduate school, or volunteer. It is discussed in new CAS Dean Ann Cudd’s newsletter, which is linked here.

A Note from Dean Cudd: Preparing our Undergraduates for the ‘Real World’

Of particular interest is the pdf file listing where undergrads are working and what their job title is.

CLASS OF 2014: Post-Graduation First Destination Profile

From now on, this list will be one of the first places I suggest undergrad job seekers look for possibilities in the US and abroad, and for BU alumni to network with.

In addition to being their placement officer, I help maintain the list of recent BU placements of our Ph.D. students which is linked here.

Recent Ph.D. job placements.

By the way, it is also not too late for alumni from our Ph.D. program to interview and hire our candidates on the job market. (Another impressive group.) They are linked here.

Current Ph.D. job candidates.

Separately, in the last two days, I met with one undergraduate and two MA students during office hours, all of whom happened to be from China. All three said that the BU program was much harder than the courses their friends were taking elsewhere as undergraduates or Master’s students. While they were in part complaining about too much work at BU, all of them were also grateful that they are getting a solid education and being challenged; it makes BU a good investment. I told them to tell their friends in China and elsewhere. It also made me feel proud that BU has not succumbed to the rampant grade inflation and work deflation that is common elsewhere.



You should get a large SSD hard drive

This email will interest anyone who processes very large data files, such as 5 GB or more, or anyone frustrated with how long Windows updates and other IO-intensive tasks take. (Most Apple users are probably already using SSD drives.)

In October the hard drive on my Windows Desktop failed (not entirely: it just became erratic), so I had to buy a replacement.

Old hard drive: 1 terabyte spinning drive, 7200 RPM, 2011 vintage.
New hard drive: 1 terabyte SSD (solid state drive), 2015. Only $389 at Microcenter (some are cheaper now).

The BU IT department was able to clone my original hard drive so that I did not have to reinstall any software. It ran from the moment I turned it on, except that it was much, much faster. How much faster? Five to ten times faster on IO-bound tasks. These graphs show the difference.


Figure: run times for various N.

Run times using mostly 0-1 binary regressors are much faster than with continuous variables, since binary variables can be compacted so nicely.

This graph is just illustrating that SAS can handle very big matrices well, although the sample sizes were fixed at 10k.

Figure: run times for various k.

If you are doing a lot of computationally intensive work on moderate-size data, then a faster CPU, multiple processors, and more RAM are critical. If you are processing Big Data, where the data is larger than your memory, then fast hard drives are the key. SSD drives are 5-10 times faster for most IO tasks.
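If you want to run a rough IO benchmark of your own, a minimal SAS sketch like the one below (the dataset name and sizes are my own made-up example, not the benchmark I actually ran) writes a large IO-bound file; run it once with WORK pointed at each drive and compare the real time reported in the log:

```sas
options fullstimer;              /* log both real (wall clock) and CPU time */

/* Write roughly 800 MB of data: the step is IO-bound,
   so real time mostly reflects hard drive speed */
data work.iotest;
  array x{100} x1-x100;
  do i = 1 to 1000000;
    do j = 1 to 100;
      x{j} = ranuni(0);          /* fill with uniform random numbers */
    end;
    output;
  end;
run;
```

If real time greatly exceeds CPU time in the log, the step is IO-bound and a faster drive will help.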

You can also upgrade your laptop to an SSD drive with lots of capacity. New laptops with an SSD and adequate memory will be faster on big data than your desktop with a conventional hard drive. I am planning to get one to upgrade my old laptop. I will try to do a better job benchmarking before and after with that upgrade.

For sale at Microcenter (Cambridge) near BU.

I purchased:
Samsung 850 EVO Series 1TB SATA III 6Gb/s mSATA Internal Solid State Drive Single Unit Version MZ-M5E1T0BW
Now $399.99; it was as low as $319. I predict prices will go down for Black Friday next week, and that prices were raised to get ready for that “sale”.
“The MZ-M5E1T0BW from Samsung utilizes innovative 3D V-NAND Technology for incredible Read/Write Performance with enhanced endurance and reliability, giving you the most evolved SSD for Ultra-thin Laptops and PCs”

One review, probably by an employee…:
“All I can say is if you have an available mSATA slot open- just do it! That old spinning HD is killing your battery life! They only last a couple of years before they crash!! This is a solid state disk drive – No moving parts to wear out. Especially if you’re accident prone like me and drop it. SSD’s do not have head crashes like spinning hard drives.
The biggest bang for the buck is the performance! the read and write speeds are instantaneous! No waiting at all. Those Microsoft updates that take hours now take minutes. The mSATA drive is very easy to install. The bay is usually under the keyboard (Two screws to remove – Google it for instructions)- just get disk cloning software and follow the instructions. Remove the old spinning piece of rust and you’re off to the races! You can even get an external case to put your old hard drive in and use it for a backup. This little upgrade may breath new life into that old laptop – saving you from having to buy a newer one for a couple of more years…. I’m sure the laptop manufacturers don’t want to hear that!”

You will also need a mounting bracket to hold it in place. For a normal 3.5″ slot I used:

Kingwin Internal Dual 2.5″ HDD/SSD to 3.5″ Plastic Mounting Kit

Cheaper now is:
Crucial BX100 1TB SATA III 6Gb/s 2.5″ Solid State Drive CT1000BX100SSD1
$309.99 in-store only

“Outlast and outperform your hard drive. Boot up almost instantly. Load programs in seconds. And accelerate demanding applications with ease. It all starts with ditching your hard drive. Engineered to outperform a hard drive and deliver cost-effective performance, the Crucial BX100 leverages advanced flash memory technology and moves your computer beyond the outdated storage limitations of spinning discs. By transmitting data in a digital manner rather than having to seek it out on a spinning platter, the Crucial BX100 is over 15x faster, 2x more reliable, and 2x more energy efficient than a typical hard drive.”

You will also need an adapter kit to make it fit in the larger 3.5″ hard drive bays found in most PC desktops, such as:

Vantec Dual 2.5″ to 3.5″ Hard Drive Mounting Kit $6.49

Talk to the staff about the computer you are putting the new hard drive into to get the right adapter kit.

Get help with installing it if you are not experienced. BU IT took less than a day (three hours) to install mine once it was purchased.


Denmark’s Social Capitalism and Switzerland’s Federal Democracy

With Democratic presidential candidates Hillary Clinton and Bernie Sanders both mentioning Democratic Socialism in Denmark, it is interesting to read about what it actually is.

Here is one link with one person’s discussion.

Denmark sounds pretty wonderful to me


I just returned from Switzerland, which is not a democratic socialist country, but rather a federalist direct democracy centered on capitalism, much like the US.

The Swiss seem to be doing many things right.


The small city of Solothurn (pop. 16,000) where we stayed had the following features. (Based on my visit, augmented by Wikipedia.)

A train, at least six bus lines and an electric trolley – for a city of 16,000!

Trains that run on weekends almost as regularly as on weekdays: twice an hour on its two lines on Sunday mornings.

Hundreds (thousands?) of locals from the town using trains to get to the local cable car and go for hikes in the Alps on a Sunday morning in November.

23% foreign national residents

No driving in the center of the city. Only pedestrians or local residents and deliveries.

At least seven museums: art museum, rock carving museum, castle arsenal museum, nature museum, pinball museum, puppet museum, history museum

Trash containers every 100 feet along most public sidewalks.

Two pedestrian-only bridges across the Aare river (good crossword answer)

Bicycle parking for over 100 bikes at the train station.

No large supermarkets or malls that I saw.



Unemployment rate of 4.6% in 2010.

Minimum wage of $20 to $25 depending on canton. In May a national referendum to raise it to the equivalent of $24.70 narrowly failed.

Only 40.3% of the people use a car to get to work (40% walk or ride a bike, while 20% use public transport).

Considered the richest country in the world.



Median tax rate for a single person earning > $150,000 is 22% in 2011.

8% value added tax (national) plus a canton rate.

0.3 to 0.5% property tax (national) (notice the decimal point)

Corporate profit tax of 8.5% (national), with some more levied by cantons (= states)

Overall fiscal rate for Switzerland was 38.5% in 2002.

Health (from Wikipedia on 11/12/15)

Swiss citizens are universally required to buy health insurance from private insurance companies, which in turn are required to accept every applicant. While the cost of the system is among the highest it compares well with other European countries in terms of health outcomes; patients who are citizens have been reported as being, in general, highly satisfied with it.[151][152][153] In 2012, life expectancy at birth was 80.4 years for men and 84.7 years for women[154] — the highest in the world.[155][156] However, spending on health is particularly high at 11.4% of GDP (2010), on par with Germany and France (11.6%) and other European countries, and notably less than spending in the USA (17.6%).[157] From 1990, a steady increase can be observed, reflecting the high costs of the services provided.[158] With an ageing population and new healthcare technologies, health spending will likely continue to rise.[158]


Like Denmark, Switzerland seems to run on trust.

This all sounds pretty attractive to me. Why can’t we look at successes in Europe more and imitate them?

Tax to rise on the uninsured next year.

This is why enrollment in health insurance will continue to rise in the US from the ACA.

For 2016: the greater of $695 or 2.5 percent of taxable income if uninsured.

Full text is below.

Health law fine for uninsured to rise

Boston Globe

Associated Press  October 19, 2015

WASHINGTON — The federal penalty for having no health insurance is set to jump to $695, and the Obama administration is being urged to highlight that fact in its new pitch for health law signups.

That means the 2016 signup season starting Nov. 1 could see penalties become a bigger focus for millions of people who have remained eligible for coverage but uninsured. They’re said to be squeezed for money and skeptical about spending what they have on health insurance.

Until now, health overhaul supporters have stressed the benefits: taxpayer subsidies that pay about 70 percent of the monthly premium, financial protection against sudden illness or an accident, and access to regular preventive and follow-up medical care.

But in 2016, the penalty for being uninsured will rise to the greater of either $695 or 2.5 percent of taxable income. That’s for someone without coverage for a full 12 months. This year the comparable numbers are $325, or 2 percent of income.

Marketing usually involves stressing the positive. Rising penalties meet no one’s definition of good news. Still, that may create a new pitch:

The math is pretty clear. A consumer would be able to get six months or more of coverage for $695, instead of owing that amount to the IRS as a tax penalty. (That is based on subsidized customers now putting in an average of about $100 a month of their own money.)

BU ranked 41 overall, 24th in Economics, by US News and World Report

Since I have blogged about rankings in the past – here and here, I thought I would blog about BU’s latest rankings by US News and World Report and elsewhere.

BU was ranked 41st, up one position, in the 2015 list of top US colleges and universities.

This ranking is across all fields, and is based mostly on survey results.

BU rankings in various subsets by USN&WR are linked here.

In Economics, BU was ranked only #24 by USN&WR, tied with Johns Hopkins, and just behind Brown, CMU, Duke, Maryland, and Rochester.

This lower-than-hoped rating of the department is not so surprising if you look at the USN&WR methodology:

“Rankings of doctoral programs in the social sciences and humanities are based solely on the results of peer assessment surveys sent to academics in each discipline.”

Peer assessments tend to change very slowly over time, and our image from before 2000 still enters into people’s rankings.

Economics had a 25 percent response rate from the department heads and directors of Graduate studies to whom they sent questionnaires.

BU tends to do better when using citations (currently ranked 12th in the US by RePEc, behind Yale, Brown, and Michigan, but ahead of U. Penn).

Another ranking is by QS World Universities, where we are ranked 18th in the US and 47th in the world, just behind Duke, Michigan, UCSD, and Brown.

“The rankings highlight the world’s top universities in 36 individual subjects, based on academic reputation, employer reputation and research impact (full methodology here)”

The AEA’s own list of rankings features several older ones.

iHEA Milan attracts twenty BU current and former students, faculty and visitors

Among the 1,400 worldwide attendees, BU was again well represented at the International Health Economics Association biennial meetings in Milan, Italy in 2015, with 20 current and former students, faculty, and visitors present:

Osea Giuntella, Giulia La Mattina, Francesco Decarolis, Ana Balsa, Daniel Maceira, Julie Shi, Michal Horny, Randy Ellis, Arturo Schweiger, Wenjia Zhu, Matilde Machado, Hsienming Lien, Jitian Sheu, Sara R Machado, Kathleen Carey, Alan B Cohen, Adam H Shapiro, Monica Galizzi, and Mead Over.

Hope to see all of you at iHEA 2017 in Boston.


Congratulations to BU’s Class of 2015 Economics graduates!

Please celebrate the students who earned 482 Boston University degrees in Economics at Commencement over the weekend. This year the program contained:

22 Ph.D. recipients

155 Master’s degree recipients (MA, MAPE, MAEP, MAGDE, MA/MBA, BA/MA)

305 BA recipients (including BA/MA)

This represents a total of 482 degrees, up from 463 in 2014.

These numbers undercount the total for the year since they may exclude students who graduated in January 2015 and chose not to appear at Commencement.

The number of graduate degree recipients (177) is down from last year when we had 234.

Last year (2014) there were 17 PhDs, 207 Master’s degree recipients, and 256 BA recipients.

Altogether 19 Ph.D. students obtained jobs this year (versus 23 last year). This is an undercount since some obtained jobs directly. To see the Ph.D. placements visit the web site linked here.

The department’s website now lists 40 regular faculty (up two) with titles of assistant, associate, or full professor, a number which matches 2012.

Congratulations to all!

Ellis SAS tips for experienced SAS users

If you are a beginning SAS programmer, then the following may not be particularly helpful, but the books suggested in the middle may be. BU students can obtain a free license for SAS to install on their own computers if it is required for a course or research project; either will require an email from an adviser. SAS is also available on various computers in the economics department computer labs.

I have also created an Ellis SAS tips page for new SAS programmers.

I do a lot of SAS programming on large datasets and thought it would be productive to share some of my SAS programming tips in one place. Large data is defined as a dataset so large that it cannot be stored in the available memory. (My largest data file to date is 1.7 terabytes.)

Suggestions and corrections welcome!

Use SAS macro language whenever possible.

It is so much easier to work with short strings than long lists, especially with repeated models and data steps:

%let rhs = Age Sex HCC001-HCC394;
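Once defined, the macro variable can be reused across many procedures without retyping the list. A minimal sketch; the dataset name (claims) and outcome variables here are hypothetical, not from my actual code:

```sas
%let rhs = Age Sex HCC001-HCC394;     /* one short string replaces a long list */

/* Reuse the same right-hand side in repeated models */
proc reg data=claims;                 /* "claims" and "totalcost" are hypothetical */
  model totalcost = &rhs;
run;

proc logistic data=claims descending;
  model anyuse = &rhs;                /* same regressors, no retyping */
run;
```

Changing the regressor list everywhere then takes a single edit to the %let statement.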


Design your programs for efficient reading and writing of files, and minimize temporary datasets.

SAS programs on large data are generally constrained by IO (input/output: reading from and writing to your hard drives), not by CPU (actual calculations) or memory (storage that disappears once your SAS program ends). I have found that some computers with high-speed CPUs and multiple cores are slower than simpler computers because they are not optimized for speedy hard drives. Large memory really helps, but for really huge files it can almost always be exceeded, and then your hard drive speeds will really matter. Even when just reading in and writing out files, hard drive speeds will be your limiting factor.

The implication of this is that you should do variable creation in as few DATA steps as possible and minimize sorts, since reading and saving datasets will take a lot of time. This requires a real change in thinking from STATA, which is designed for changing one variable at a time on a rectangular file. Recall that STATA can do this efficiently since it usually starts by bringing the full dataset into memory before making any changes. SAS does not do this, which is one of its strengths.

Learning to use DATA steps and PROC SQL well is a central advantage of the experienced SAS programmer. Invest, and you will save time waiting for your programs to run.
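As a minimal sketch of this one-pass philosophy (file and variable names are hypothetical), PROC SQL can create several person-level summary variables in a single read of a large claims file, replacing a sort plus a BY-group data step:

```sas
proc sql;
  create table work.person as
  select idno,
         count(*)      as n_claims,
         sum(paid)     as total_paid,
         max(fromdate) as last_date
  from in.claims
  group by idno;          * one pass through the big file, no sort needed;
quit;
```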

Clean up your main hard drive if at all possible.

Otherwise you risk SAS crashing when your hard drive gets full. If it does, cancel the job and be sure to delete the temporary SAS datasets that may have been created before you crashed. The SAS default for storing temporary files is something like

C:\Users\"your_user_name".AD\AppData\Local\Temp\SAS Temporary Files

Unless you have SAS currently open, you can safely delete all of the files stored in that directory. Ideally, there should be none since SAS deletes them when it closes normally. It is the abnormal endings of SAS that cause temporary files to be saved. Delete them, since they can be large!

Change the default hard drive for temporary files and sorting

If you have a large internal secondary hard drive with lots of space, then change the SAS settings so that it uses temp space on that drive for all work files and sorting operations.

To change this default location to a different internal hard drive, find your sasv9.cfg file which is in a location like

“C:\Program Files\SASHome\x86\SASFoundation\9.3\nls\en”

“C:\Program Files\SASHome2-94\SASFoundation\9.4\nls\en”

Find the line in the config file that starts with -WORK and change it to your own location for the temporary files (mine are on drives J and K), such as:

-WORK "k:\data\temp\SAS Temporary Files"

-UTILLOC "j:\data\temp\SAS Temporary Files"

The first line sets where SAS stores its temporary work files, such as WORK.ONE, which you create with a statement like DATA ONE;

The second line sets where SAS stores its own utility files, such as when sorting a file or saving residuals.

There is a reason to put the WORK and UTIL files on different drives: SAS is then generally reading in from one drive and writing out to a different one, rather than reading and writing on the same drive. Try to avoid the latter. Run some tests on your own computer to see how much time you can save by switching between two drives instead of using only one.

Use only internal hard drives for routine programming

Very large files may require storage or backup on external hard drives, but these are incredibly slow: three to ten times slower than an internal hard drive. Try to minimize their use for actual project work. Instead, buy more internal drives if possible. You can purchase an additional internal hard drive with 2 TB of space for under $100; you will save that much in time the first day!

Always try to write large datasets to a different disk drive than you read them in from.

Do some tests copying large files from c: to c: and from C: to F: You may not notice any difference until the file sizes get truly huge, greater than your memory size.

Consider using binary compression to save space and time if you have a lot of binary variables.

By default, SAS stores datasets in a fixed rectangular format that leaves lots of empty space when you use integers instead of real numbers. Although I have long been a fan of using OPTIONS COMPRESS=YES to save space and run time (but not CPU time), I only recently discovered that

OPTIONS COMPRESS=BINARY;

is even better for integers and binary flags when they outnumber real numbers. For some large datasets with lots of zero-one dummies it has reduced my file size by as much as 97%! Standard numeric variables are stored as 8 bytes = 64 bits, so in principle you could store 64 binary flags in the space of one real number. Try saving some files under different compression settings and see if your run times and storage space improve. Note: compression can INCREASE file size for real numbers! It seems that compression saves space when binary flags outnumber real numbers and integers;

Try various permutations of the following on your computer with your actual data to see what saves time and space:

data real;
  retain x1-x100 1234567.89101112;
  do i = 1 to 100000; output; end;
run;
proc means; run;

data dummies;
  retain d1-d100 1;
  do i = 1 to 100000; output; end;
run;
proc means; run;

*try various data steps with this, using the same or different drives. Bump up the obs to see how times change;
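One way to run the comparison, assuming the REAL and DUMMIES test datasets above have been created; the NOTE lines in the log report the percent compression achieved for each copy:

```sas
options compress=yes;
data work.dummies_char; set dummies; run;   * RLE (character) compression;

options compress=binary;
data work.dummies_bin;  set dummies; run;   * RDC (binary) compression;

options compress=no;
data work.dummies_none; set dummies; run;   * uncompressed baseline;
```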


Create a macro file where you store macros that you want to have available anytime you need them. Do the same with your formats;

options nosource;
%include "c:/data/projectname/macrofiles";
%include "c:/data/projectname/allformats";
options source;

Be aware of which SAS procs create large, intermediate files

Some but not all procs create huge temporary datasets.

Consider: PROC REG and PROC GLM generate all of their results in one pass through the data unless you have an OUTPUT statement; with one, they create large, uncompressed, temporary files that can be a multiple of your original file size. PROC SURVEYREG and PROC MIXED create large intermediate files even without an OUTPUT statement. Plan accordingly.

Consider using OUTEST=BETA to more efficiently create residuals together with PROC SCORE.

Compare two ways of making residuals;

*make test dataset with ten million obs, but trivial model;

data test;
retain junk1-junk100 12345;  * it is carrying along all these extra variables that slows SAS down;
do i = 1 to 10000000;
x = rannor(234567);
y = x+rannor(12345);
output;
end;
Run;    * 30.2 seconds;
*Straightforward way; Times on my computer shown following each step;
proc reg data = test;
y: model y = x;
output out=resid (keep=resid) residual=resid;
run;  *25 seconds;
proc means data = resid;
run;  *.3 seconds;

*total of the above two steps is 25.6 seconds;

proc reg data = test outest=beta ;
resid: model y = x;
run;                     *3.9 seconds;
proc print data = beta;
run;  *take a look at beta that is created;
proc score data=test score=beta type=parms
out=resid (keep=resid) residual;
var x;
run;       *6 seconds!;
proc means data = resid;
run;  *.3 seconds;

*total from the second method is 10.3 seconds versus 25.6 seconds for the direct approach, PLUS no temporary files needed to be created that might crash the system;

If the model statement in both regressions is

y: model y = x junk1-junk100; *note that all of the junk variables have coefficients of zero, but SAS does not know this going in;

then the two times are

Direct approach:    1:25.84
Scoring approach:  1:12.46 on regression plus 9.01 seconds on score = 1:21.47 which is a smaller savings

On very large files the time savings are even greater because of the reduced IO; SAS was still able to do this without writing onto the hard drive in this "small" sample on my computer. But the real savings is on temporary storage space.

Use a bell!

My latest addition to my macro list is the following bell macro, which makes sounds.

Use %bell; at the end of a SAS program that you run in batch, and you will notice when the program has finished running.

%macro bell;
*plays the trumpet call; useful to put at the end of a batch program to know when the batch file has ended;
*Randy Ellis and Wenjia Zhu November 18 2014;
data _null_;
call sound(392.00,70); *first argument is frequency, second is duration;
call sound(523.25,70);
call sound(659.25,70);
call sound(783.99,140);
call sound(659.25,70);
call sound(783.99,350);
run;
%mend bell;

Purchase essential SAS programming guides.

I gave up on purchasing paper copies of the SAS manuals because they take up more than two feet of shelf space and are still neither complete nor up to date. I find the SAS help menus useful but clunky. I recommend the following if you are going to do serious SAS programming. Buy them used on Amazon or wherever; I would get an older edition, which will cost less than $10 each. Really.

The Little SAS Book: A Primer, Fifth Edition (or an earlier one)

Nov 7, 2012

by Lora Delwiche and Susan Slaughter

Beginners introduction to SAS. Probably the best single book to buy when learning SAS.


Professional SAS Programmer’s Pocket Reference Paperback

By Rick Aster

Wonderful, concise summary of all of the main SAS commands, although you will have to already know SAS to find it useful. I use it to look up specific functions, macro commands, and options on various procs because it is faster than using the help menus. But I am old style…

Professional SAS Programming Shortcuts: Over 1,000 ways to improve your SAS programs Paperback

By Rick Aster

I don’t use this as much as the above, but if I had time, and were learning SAS instead of trying to rediscover things I already know, I would read through this carefully.

Get in the habit of deleting most intermediate permanent files

Delete files if either

1. You won’t need them again or

2. You can easily recreate them again.  *this latter point is usually true;

Beginning programmers tend to save too many intermediate files. Usually it is easier to rerun the entire program than to save the intermediate files. Give your final file of interest a name like MASTER or FULL_DATA and then keep modifying it by adding variables, instead of using names like SORTED, STANDARDIZED, RESIDUAL, or FITTED.

Consider a macro that helps make it easy to delete files.

%macro delete(library=work, data=temp, nolist=);

proc datasets library=&library &nolist;
delete &data;
run;
%mend delete;

*sample macro calls;

%delete (data=temp);   *for temporary WORK files; you can also list multiple file names, but these disappear anyway at the end of your run;

%delete (library=out, data=one two three);   *for two-level file names in directory OUT;

%delete (library=out, data=one, nolist=nolist);   *gets rid of the list in the output;



Ellis SAS tips for New SAS programmers

There is also a posting on Ellis SAS tips for Experienced SAS programmers, which focuses on issues when using large datasets.


Randy’s SAS hints for New SAS programmers, updated Feb 21, 2015


  1. ALWAYS begin and intermix your programs with internal documentation. (Note how I combined six forms of emphasis in ALWAYS: color, larger font, caps, bold, italics, underline.) Normally I recommend only one, but documenting your programs is really important. (Using only one form of emphasis is also important, just not really important.)

A simple example to start your program in SAS is

* Program = test1, Randy Ellis, first version: March 8, 2013 – test program on SAS features;

Any comment starting with an asterisk and ending in a semicolon is ignored;


  2. Most common errors/causes of wasted time while programming in SAS.

a. Forgetting semicolons at the end of a line

b. Omitting a RUN statement, and then waiting for the program to run.

c. Unbalanced single or double quotes.

d. Commenting out more code than you intend to.

e. Foolishly running a long program on a large dataset that has not first been tested on a tiny one.

f. Trying to print out a large dataset which will overflow memory or hard drive space.

g. Creating an infinite loop in a datastep. Here is one silly one; usually they can be much harder to identify.

data infinite_loop;
x = 1;
do while (x = 1);
if nevertrue = 1 then x = 0;  * nevertrue is never 1, so x never changes;
end;
run;

h. There are many other common errors and causes of wasted time. I am sure you will find your own.


  3. With big datasets, 99% of the time it pays to use the following system OPTIONS:


options compress =yes nocenter;


options compress =binary nocenter;

binary compression works particularly well with many binary dummy variables and sometimes is spectacular in saving 95%+ on storage space and hence speed.


/* mostly use */
options nocenter /* SAS sometimes spends many seconds figuring out how to center large print outs of
data or results. */
ps=9999               /* avoid unneeded headers and page breaks that split up long tables in output */
ls=200;                /* some procs like PROC MEANS give less output if a narrow line size is used */

*other key options to consider;

Options obs = max   /* or obs=100, Max= no limit on maximum number of obs processed */
Nodate nonumber /* useful if you don’t want SAS to embed headers at top of each page in listing */
Macrogen     /* show the SAS code generated after running the Macros. */
Mprint   /* show how macro code and macro variables resolve */
nosource /* suppress source code from long log */
nonotes   /* be careful, but can be used to suppress notes from log for long macro loops */

;                       *remember to always end with a semicolon!;


  4. Use these three key procedures regularly

Proc contents data=test; run; /* shows a summary of the file similar to Stata’s DESCRIBE */
Proc means data = test (obs=100000); run; /* set a max obs if you don’t want this to take too long */
Proc print data = test (obs=10); run;


I recommend you create and use regularly a macro that does all three easily:

%macro cmp(data=test);
Proc contents data=&data; Proc means data=&data (obs=1000); Proc print data=&data (obs=10); run;
%mend cmp;

Then do all three (contents, means, print ten obs) with just

%cmp(data = mydata);


  5. Understand temporary versus permanent files;

Data one;   creates a temporary dataset that disappears when SAS terminates;

Data out.one; creates a permanent dataset in the OUT directory that remains even if SAS terminates;


Define libraries (or directories):

Libname out "c:/data/marketscan/output";
Libname in "c:/data/marketscan/MSdata";


Output or data can be written into external files:

Filename textdata "c:/data/marketscan/textdata.txt";


  6. Run tests on small samples to develop programs, and then toggle between tiny and large samples once debugged.

A simple way is

Options obs =10;
*options obs = max;  *only use this when you are sure your programs run;

OR: some procedures and data steps (for example, those using the END= dataset option) do not work well on partial samples. For those I often toggle between two different input libraries: create a subset image of all of your data in a separate directory, and then toggle using the libname commands;


*Libname in 'c:/data/projectdata/fulldata';
Libname in 'c:/data/projectdata/testsample';


Time spent creating a test data set is time well spent.

You could even write a macro to make it easy. (I leave it as an exercise!)
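One possible sketch of such a macro, using the directory names from the example above (the macro name and parameter are my own invention):

```sas
%macro setlib(mode=test);
  %if &mode = full %then %do;
    libname in 'c:/data/projectdata/fulldata';
  %end;
  %else %do;
    libname in 'c:/data/projectdata/testsample';
  %end;
%mend setlib;

%setlib(mode=test);    * develop on the tiny sample;
*%setlib(mode=full);   * switch over once the program is debugged;
```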


  7. Use arrays abundantly. You can use different array names to reference the same set of variables, which is very convenient;


%let rhs=x1 x2 y1 y2 count more;
Data _null_;
Array X {100} X001-X100; *usual form;
Array y {100} ;                     * creates y1-y100;
Array xmat {10,10} X001-X100; *matrix notation allows two dimensional indexes;
Array XandY {*} X001-X100 y1-y100 index ; *useful when you don’t know the count of variables in advance;
Array allvar &rhs. ;     *implicit arrays can use implicit indexes;

*see various ways of initializing the array elements to zero;

Do i = 1 to 100; x{i} = 0; end;

Do i = 1 to dim(XandY); XandY{i} = 0; end;


Do over allvar; allvar = 0; end;   *sometimes this is very convenient;


Do i = 1 to 100 while (y{i} = .);
y{i} = 0;
end;   *do while and do until are sometimes useful;



  8. For some purposes, naming variables in arrays using leading zeros improves the sort order of the variables

Array x {100} X001-X100;
Array x {100} X1-X100;

With the second, the alphabetically sorted variable names are x1, x10, x100, x11, x12, …, x19, x2, x20, etc.


  9. Learn the SET versus MERGE commands (UPDATE is for rare, specialized use)


Data three;   *information on the same person combined into a single record;
Merge ONE TWO;
By IDNO;   *both files must already be sorted by IDNO;
run;


  10. Learn key dataset options like KEEP=, DROP=, RENAME=, OBS=, FIRSTOBS=, IN=, and WHERE=



  11. Keep files being sorted "skinny" by using DROP or KEEP statements

Proc sort data = IN.BIG(keep=IDNO STATE COUNTY FROMDATE) out=out.bigsorted;
By IDNO FROMDATE;   *sort keys shown are illustrative;
run;

Also consider NODUP and NODUPKEY options to sort while dropping duplicate records, on all or on BY variables, respectively.


  12. Take advantage of BY group processing

Use FIRST.var and LAST.var abundantly.
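A typical pattern, assuming a claims file already sorted by IDNO (file and variable names are illustrative): reset an accumulator at FIRST.var, and output one record per group at LAST.var.

```sas
data person_counts;
  set in.claims;                     * must already be sorted by idno;
  by idno;
  if first.idno then n_claims = 0;   * reset the counter for each new person;
  n_claims + 1;                      * sum statement; implies RETAIN;
  if last.idno then output;          * write one record per person;
  keep idno n_claims;
run;
```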


USE special variables:
_N_ = the current observation counter
_ALL_ = the set of all variables, as in PUT _ALL_; (or, with PROC CONTENTS, the set of all datasets)


Also valuable is

PROC CONTENTS data = in._all_; run;


  13. Use lots of comments


* this is a standard SAS comment that ends with a semicolon;


/*   a PL1 style comment can comment out multiple lines including ordinary SAS comments;

* Like this; */


%macro junk; Macros can even comment out other macros or other pl1 style comments;

/*such as this; */ * O Boy!; %macro ignoreme;  %mend; *very powerful;


%mend; * end macro junk;


  14. Use meaningful file names!

Data ONE TWO THREE; can be useful for quick scratch work, but give any file you keep a meaningful name.


  15. Put internal documentation about what the program does, who did it, and when.
  16. Learn basic macro language; see the SAS program demo for examples. Know the difference between executable and declarative statements used in the DATA step.
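A minimal example of basic macro language, assuming yearly claims files (the dataset names are hypothetical): a %DO loop generates the same proc for each year.

```sas
%macro meansbyyear;
  %do yr = 2010 %to 2012;
    proc means data=in.claims&yr;
      var paid;
      title "Claims summary for &yr";
    run;
  %end;
%mend meansbyyear;

%meansbyyear;
```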


17. EXECUTABLE COMMANDS USED IN DATA STEP (Actually DO something, once for every record)


Y=y+x (assignment. In STATA you would use GEN y=x or REPLACE Y=X)
Do I = 1 to 10;
End; (always paired with DO, can be nested to nearly unlimited depth)


INFile in ‘c:/data/MSDATA/claimsdata.txt’;               define where input statements read from;
File out ‘c:/data/MSDATA/mergeddata.txt’;             define where put statements write to;


Goto johnny;      * always avoid. Use do groups instead;


IF a=b THEN y=0 ;
ELSE y=x; * be careful when multiple if statements;
CALL subroutine(); (Subroutines are OK, Macros are better)


INPUT   X ; (read in one line of X as text data from INFILE)
PUT   x y= / z date.; (Write out results to current LOG or FILE file)


BY IDNO;         *   Match up with BY variable IDNO as you simultaneously read in A&B;

Both files must already be sorted by IDNO.

SET A B;                                           * read in order, first all of A, and then all of B;

UPDATE   A B; *replace variables with new values from B only if non missing in B;


OUTPUT out.A;      *Write out one obs to out.A SAS dataset;
OUTPUT;                *Writes out one obs of every output file being created;

DELETE;   * do not output this record, and return to the top of the datastep;

STOP;                               * ends the current SAS datastep;


18. DECLARATIVE COMMANDS USED IN DATA STEP (processed only once, at the start of the data step)



Data ONE TWO IN.THREE;
*This creates three data sets, named ONE, TWO, and IN.THREE;
*Only the third one will be kept once SAS terminates;

Array x {10} x01-x10;
ATTRIB x length=8 Abc length=$8;
BY state county IDNO;
Also consider  
BY DESCENDING IDNO; or BY IDNO NOTSORTED; if grouped but not sorted by IDNO;
DROP i;   *do not keep i in the final data set, although it can still be used while the data step is running;
KEEP IDNO AGE SEX; *this will drop all variables from output file except these three;
FORMAT x date.;   *permanently links the format DATE. to the variable x;


LABEL AGE2010 = “Age on December 31 2010”;
LENGTH x 8; *must be assigned the first time you reference the variable;
RENAME AGE = AGE2010; The output dataset will contain AGE2010; within the data step you still reference AGE;
OPTIONS OBS=100; One of many options. Note: processed only once.


19. Key Systems language commands

LIBNAME to define libraries
FILENAME to define specific files, such as for text data to input or output text



%LET year=2011;

%LET ABC = "Randy Ellis";


20. Major procs you will want to master

DATA step !!!!! Counts as a procedure;





PROC FREQ                      frequencies

PROC SUMMARY      (Can be done using MEANS, but easier)

PROC CORR (Can be done using Means or Summary)


PROC GLM   General Linear Models with automatically created fixed effects



PROC GENMOD nonlinear models

PROC SURVEYREG clustered errors

None of the above will execute unless a new PROC is started OR you include a RUN; statement.

21. Formats are very powerful. Here is an example from the MarketScan data. One use is to simply recode variables so that richer labels are possible.


Another use is to look up or merge on other information in large files.


Proc format;
value $region
'1'='1-Northeast Region'
'2'='2-North Central Region'
'3'='3-South Region'
'4'='4-West Region'
'5'='5-Unknown Region';

value $sex
'1'='1-Male'
'2'='2-Female'
other='Missing/Unknown';
run;



*Three different uses of formats;

Data one;
Label sex = 'patient sex = 1 if male';
label region = 'census region';

Proc print data = one;



data two;
set one;
Format sex $sex.; * permanently assigns sex format to this variable and stores format with the dataset;

Proc print data = two;

Proc contents data = two;

*be careful if the format is very long!;


Data three;
Set one;
sex_text = put(sex, $sex.);   *variable name sex_text is illustrative;

*maps sex into the label and saves a new variable holding the text strings. Be careful: these can be very long;

Proc print data =three;


Proc print data = one;
Format sex $sex.;
*this is almost always the best way to use formats: Only on your results of procs, not saved as part of the datasets;


If you are trying to learn SAS on your own, then I recommend you buy:

The Little SAS Book: A Primer, Fifth Edition (or an earlier one)

Nov 7, 2012

by Lora Delwiche and Susan Slaughter

Beginners introduction to SAS. Probably the best single book to buy when learning SAS.

Deflategate pressure drop is consistent with a ball air temperature of 72 degrees when tested initially.


I revised my original Deflategate posting after learning that it is absolute air pressure, not pressure above standard sea level pressure, that follows the Ideal Gas Law. I also allowed for stretching of the leather once the ball becomes wet, and for the possibility that the cold rain made the balls colder (45 degrees F) than the recorded air temperature of 53 degrees F. Together these adjustments make it even easier for the weather to fully explain the drop in ball pressure.

My Bottom Line: The NFL owes the Patriot Nation and Bob Kraft a big apology.

Correction #1: My initial use of the ideal gas formula did not recognize that it is absolute pressure, not pressure above the ambient air pressure, that matters. Hence a ball with a pressure of 12.5 PSI is actually 12.5 PSI above the surrounding air pressure, which is about 14 PSI at sea level. So a decline from 12.5 PSI to 10.5 PSI is actually only an 8.2 percent decline in absolute pressure, from 26.5 to 24.5 PSI. This makes it much easier for temperature changes to explain the difference in ball pressure. Only an 8.2 percent change in absolute temperature (approximately a 42 degree Fahrenheit drop) would be required if that were the only change.

Correction #2: It is well established that water allows leather to stretch. I found one site noting that leather can stretch by 2-5% when wet, although it does not specify how much force is needed to achieve this.

It is plausible that a new ball made of leather under pressure (scuffed up to let in the moisture quickly) might stretch 1 percent upon getting wet (such as in the rain). Since volume goes up with the cube of this stretching, this would be a (1.01)^3 - 1 = 3 percent increase in ball volume, and hence decline in pressure. This amount would reduce the absolute temperature difference needed for the 2 PSI drop to only 5.2 percent (a change of only 27 degrees F).

Correction #3: It was raining on game day, and the rain was probably much colder than the outside air temperature. So it is plausible that the game ball was as cold as 45 degrees Fahrenheit at game time when the low ball pressures were detected. This makes even lower initial testing temperatures consistent with the professed levels of underinflation.

A single formula can be used to calculate the ball temperature needed at the initial testing to explain a ball pressure detected during the game that is 2 PSI lower, after the ball got colder (to 45 degrees F), 0.4 percent smaller in volume (since ball volume shrinks when cold), and stretched 1% due to rain. It would be

Pregame testing temperature in F =(pressure change as a ratio)/(volume change due to cold)/(volume change due to leather stretching 1% when wet)*(45 degree ball temperature during game+460 degrees) – 460 degrees

(12.5+14)/(10.5+14)/(0.996)/(1.01^3)*(45+460) - 460 = 72 degrees Fahrenheit

Given this math, it would have been surprising if the ball pressure had NOT declined significantly.

Final comment #1: All of these calculations and hypotheses can be tested empirically. See the empirical analysis done by Headsmart Labs, which finds that rain plus a 25 degree temperature drop is consistent with a 1.82 PSI decrease.

Final comment #2: Since the original game balls were reinflated by officials during halftime, the true ball pressures during the first half will never be known. Moreover there seems to be no documentary record of their pressures at the time they were re-inflated.

The XLIX Super Bowl was a terrific game from the point of view of Patriots fans. Now it is time for the NFL to own up to its own mistake in accusing the Patriots of cheating. It was just a matter of physics.

Revised calculations


Various combinations of testing temperatures and PSI
Adjustments for temperature only, correcting for absolute pressure at 14 PSI at sea level Adjustments for changes in ball volume Adjusting for temperature and football volume
Temperature F Degrees above Absolute zero Temperature adjustment Various game time or testing PSI readings surface area sphere radius mean football radius volume Volume adjustment Various game time or testing PSI readings
Game time temperature 45 505 1.000 10.5 11 11.5 189 3.8782 3.81183 232 1.000 10.5 11 11.5
60 520 1.030 11.2 11.7 12.3 189.2427 3.8807 3.81427 232.447 0.998 11.3 11.8 12.3
70 530 1.050 11.7 12.2 12.8 189.4045 3.8824 3.81590 232.7451 0.997 11.8 12.3 12.8
Possible testing temperatures 80 540 1.069 12.2 12.7 13.3 189.5663 3.8840 3.81753 233.0434 0.996 12.3 12.9 13.4
90 550 1.089 12.7 13.2 13.8 189.7280 3.8857 3.81916 233.3418 0.994 12.8 13.4 13.9
100 560 1.109 13.2 13.7 14.3 189.8898 3.8873 3.82079 233.6403 0.993 13.4 13.9 14.5
110 570 1.129 13.7 14.2 14.8 190.0516 3.8890 3.82242 233.939 0.992 13.9 14.5 15.0
120 580 1.149 14.1 14.7 15.3 190.2134 3.8906 3.82404 234.2378 0.990 14.4 15.0 15.6
130 590 1.168 14.6 15.2 15.8 190.3752 3.8923 3.82567 234.5367 0.989 14.9 15.5 16.1
140 600 1.188 15.1 15.7 16.3 190.5370 3.8940 3.82730 234.8357 0.988 15.5 16.1 16.7
150 610 1.208 15.6 16.2 16.8 190.6988 3.8956 3.82892 235.1349 0.987 16.0 16.6 17.2
160 620 1.228 16.1 16.7 17.3 190.8606 3.8973 3.83054 235.4342 0.985 16.5 17.1 17.8
Temperature (F) at which ball would pass test: 2 PSI diff / 1.5 PSI diff / 1 PSI diff
Temperature only: 86 / 75 / 65
Temperature and volume change from temp: 88 / 77 / 67
Temp, volume, and stretching from wetness: 72 / 62 / 51
Last row calculated as (12.5+14)/(inferred test level+14)/(0.996)/(1.01^3)*(45+460)-460
Revised calculations allow for sea level pressure of 14 PSI, so a drop from 12.5 to 10.5 PSI (above that level) requires only a (12.5+14)/(10.5+14)-1 = 8.2 percent change in absolute temperature.
See notes at the top, but the final calculations also allow for the possibilities that the ball temperature was 45 degrees, not 53, due to cold rain, and that the leather stretched 1% due to rain.
Fields in first row and first column are input parameters, others are calculated


Original post

There is no mention of the temperature at which the footballs need to be stored or tested in the official NFL rule book. (Sloppy rules!)

The process of scuffing up the new balls to make them feel better no doubt warms them up. It would be impossible for it to be otherwise. An empirical question is how much did it warm them up and what temperature were they when tested?

Surface temperatures could have been below the internal temperature of the air, which is what matters for the pressure. Leather is a pretty good insulator (hence its use in many coats).

Anyone who took high school physics may remember that pressure and temperature satisfy


Pressure*Volume=Number of moles*ideal gas constant*Temperature  (Ideal Gas Law)

Temperature needs to be measured in degrees above absolute zero, which is -459.67 Fahrenheit (sorry, metric readers!). The temperature at game time was 53 degrees. So the right question to ask is: at what temperature, T1, would the air in the ball have to be at the time the balls were tested, such that once they cooled down to T0 = 53 degrees they measured two pounds per square inch (PSI) below the allowed minimum?

The lowest allowed pressure for testing was 12.5 PSI. We are told only vaguely that the balls were 2 PSI lower than this, but this is not a precise number; it could have been rounded up from 1.501 PSI, which would mean the balls might have been 11 PSI when tested during the game. I examine 10.5, 11, and 11.5 as possible game time test PSI levels. The following table shows possible combinations of game time testing temperatures and half-time testing temperatures that would be consistent with various pressures. The right hand side of the table adjusts for the fact that the leather/rubber in the ball would also have shrunk as the ball cooled down, which works against the temperature effect. Using the formula PSI1 = PSI0*(T1+459.67)/(T0+459.67) (see correction above!) and ignoring the volume change of the ball, it is straightforward to solve for the initial temperature the balls would have had to be at for the observed game time pressures.

Adjusting for a plausible guess at the small amount that the leather plus rubber bladder would have also changed makes only a small difference.

For a 1.5 PSI difference from testing to halftime, the air inside the balls would have had to be at about 128 degrees at the time they were tested. (The leather skin could have been at a lower temperature.) This would have made them feel warm, but not burning hot, to the hand.

Allowing the balls to be warm when tested is sneaky or perhaps accidental, but not cheating.

Go Pats!

Various combinations of testing temperatures and PSI
Adjustments for temperature only Adjustments for changes in ball volume Adjusting for temperature and football volume
Temperature F Degrees above Absolute zero Temperature adjustment Various game time or testing PSI readings surface area sphere radius mean football radius volume Volume adjustment Various game time or testing PSI readings
Game time temperature 53 512.67 1.000 10.5 11 11.5 189 3.8782 3.81183 232 1.000 10.5 11 11.5
Possible testing temperatures 80 539.67 1.053 11.1 11.6 12.1 189.4368 3.8827 3.81623 232.8048 1.003 11.0 11.5 12.1
90 549.67 1.072 11.3 11.8 12.3 189.5986 3.8844 3.81786 233.1031 1.005 11.2 11.7 12.3
100 559.67 1.092 11.5 12.0 12.6 189.7604 3.8860 3.81949 233.4015 1.006 11.4 11.9 12.5
110 569.67 1.111 11.7 12.2 12.8 189.9222 3.8877 3.82112 233.7001 1.007 11.6 12.1 12.7
120 579.67 1.131 11.9 12.4 13.0 190.0840 3.8893 3.82274 233.9988 1.009 11.8 12.3 12.9
130 589.67 1.150 12.1 12.7 13.2 190.2458 3.8910 3.82437 234.2976 1.010 12.0 12.5 13.1
140 599.67 1.170 12.3 12.9 13.5 190.4076 3.8926 3.82600 234.5965 1.011 12.1 12.7 13.3
150 609.67 1.189 12.5 13.1 13.7 190.5693 3.8943 3.82762 234.8956 1.012 12.3 12.9 13.5
160 619.67 1.209 12.7 13.3 13.9 190.7311 3.8959 3.82924 235.1948 1.014 12.5 13.1 13.7
Temperature (F) at which ball would pass test: temperature only: 151 / 123 / 98; temperature and volume: 159 / 128 / 101
Fields in yellow are input parameters, others are calculated
Column C is temperature minus absolute zero
Column D is the ratio of column C to the game time temp in absolute degrees and shows how much higher PSI would have been than at game time.
Columns E through G show possible testing PSI for three possible game time PSI levels.
Columns H through L show adjustments to volume which tend to reduce the PSI as a ball is heated. Calculations use rate of expansion of hard rubber per square inch per degree.
Columns M through O show ball PSI after adjusting for both air temperature and football volume
Parameters and formulas
absolute zero= -459.67 fahrenheit
hard rubber expansion: 42.8 × 10^-6 in/(in °F)
or 0.0000428, used for column I expansion of surface area
Surface area is assumed to grow with the square of this proportion with temperature.
The approximate volume and surface area of a standard football are 232 cubic inches and 189 square inches, respectively.
Surface area of a sphere: A = 4πr²; used to calculate the radius of the sphere.
Volume of a sphere: V = (4/3)πr³; used to calculate the volume of the football. The volume is adjusted downward by a fixed proportion because footballs are not spheres.


NFL rules

Rule 2 The Ball. Section 1 BALL DIMENSIONS. The Ball must be a “Wilson,” hand selected, bearing the signature of the Commissioner of the League, Roger Goodell. The ball shall be made up of an inflated (12 1/2 to 13 1/2 pounds) urethane bladder enclosed in a pebble grained, leather case (natural tan color) without corrugations of any kind. It shall have the form of a prolate spheroid and the size and weight shall be: long axis, 11 to 11 1/4 inches; long circumference, 28 to 28 1/2 inches; short circumference, 21 to 21 1/4 inches; weight, 14 to 15 ounces. The Referee shall be the sole judge as to whether all balls offered for play comply with these specifications. A pump is to be furnished by the home club, and the balls shall remain under the supervision of the Referee until they are delivered to the ball attendant just prior to the start of the game.

From the Free Dictionary: ideal gas law, n. A physical law describing the relationship of the measurable properties of an ideal gas, where P (pressure) × V (volume) = n (number of moles) × R (the gas constant) × T (temperature in Kelvin). It is derived from a combination of the gas laws of Boyle, Charles, and Avogadro. Also called universal gas law.
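Applied to a football at fixed volume and fixed amount of air, the law reduces to P₁/T₁ = P₂/T₂, with pressure and temperature both in absolute units. A minimal sketch of that relationship (the function name is mine; 14.7 psi is the standard sea-level atmospheric pressure used to convert gauge readings to absolute):

```python
ATM_PSI = 14.7  # standard sea-level atmospheric pressure, psi

def gauge_psi_at(gauge_psi, temp_f_from, temp_f_to):
    """Ideal gas law at constant volume and moles: P/T is constant.
    Gauge pressure is converted to absolute (gauge + atmospheric) and
    temperatures to degrees Rankine (degF + 459.67) before scaling."""
    p_abs = gauge_psi + ATM_PSI
    ratio = (temp_f_to + 459.67) / (temp_f_from + 459.67)
    return p_abs * ratio - ATM_PSI
```

For example, a ball inflated to 12.5 psi in a 70 °F room would read about 11.5 psi on a 50 °F field, before any adjustment for the ball's volume.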


#6 Raise the minimum wage for jobs not offering health insurance

Time to change the policy discussion.

Congress has been unwilling to raise the minimum wage despite strong public support for doing so. This blog suggests a concrete approach for getting even broader public support and potentially reducing the need for federal taxes.

As of January 1, 2015, 29 states and DC have minimum wages above the federal minimum wage, which is still only $7.25 per hour. For a worker working 40 hours per week, 50 weeks per year, the minimum wage yields only $14,500 per year, which is below the 2014 federal poverty level for a family of two ($15,730) in all states and DC. At these low income levels, even full-time employees still cannot afford health insurance and will mostly be relying on large health insurance subsidies and the Earned Income Tax Credit (EITC). The insurance subsidy for a minimum wage worker enrolling in a private silver plan is currently at least $4,237 per year for an adult with one child, while the EITC is currently $3,359 for a single worker with one child earning the minimum wage. Hence an employer paying only the minimum wage is counting on a government subsidy of at least $7,596 per year for a worker with one child, which is $3.80 per hour.
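The arithmetic behind these figures is straightforward, assuming a full-time year of 2,000 hours (40 hours × 50 weeks), with the subsidy amounts taken from the paragraph above:

```python
HOURS_PER_YEAR = 40 * 50  # full time: 40 hours/week, 50 weeks/year

min_wage = 7.25                          # federal minimum wage, $/hour
annual_pay = min_wage * HOURS_PER_YEAR   # $14,500 per year

insurance_subsidy = 4237  # silver-plan subsidy, adult with one child
eitc = 3359               # EITC, single minimum-wage worker with one child

total_subsidy = insurance_subsidy + eitc              # $7,596 per year
subsidy_per_hour = total_subsidy / HOURS_PER_YEAR     # about $3.80/hour
```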

A simple approach that would encourage more firms to offer health insurance is to raise the minimum wage required for any position that does not include an offer of subsidized health insurance. For concreteness, I propose a minimum wage of $12 per hour for jobs without health insurance, versus $8 per hour for jobs that include subsidized health insurance. (Those age 21 and under would also be eligible for the $8 per hour rate.) Whether the job is for 10, 30, or 40 hours per week does not matter; only whether subsidized health insurance is offered. This four-dollar-per-hour increment would encourage firms to bear the full cost of their workers and reduce the burden on federal tax revenue and the budget.

In Massachusetts, the minimum wage just increased on January 1, 2015 from $8 to $9 per hour. The state's economy continues to do well, and I still see help-wanted signs in retail windows. We also still have plenty of low-cost food and retail stores and services. Reduced employment is not visible, and any reduction would likely be more than offset by the stimulative effect of reduced taxes. I see no reason why we couldn't leave it to the states to decide whether to use the same or higher minimum wages for jobs with and without health insurance, as long as both minimums are met.

In Australia, the minimum wage is US$13.84 (16.87 Australian dollars), everyone has national health insurance, and the unemployment rate is comparable to the US at 6.2 percent (November 2014). When we were there in 2011, we rather liked that our gardener and most restaurant workers were Australian citizens who spoke English well, not the low-paid foreigners and recent immigrants they tend to be in the US.

As I write this blog, the US Congress is debating whether to partially undo the employer mandate provisions of the Affordable Care Act by exempting firms from any penalty for not offering health insurance to employees working less than 40 hours per week; the current standard is 30 hours. This could have a disastrous effect: so many workers work about 40 hours that it would be easy for employers to avoid the (modest) ACA penalties by reducing worker hours. Moreover, without the employer mandate, many workers will remain uninsured. A higher minimum wage for jobs not offering health insurance would greatly weaken the incentive for firms to cut employee hours to avoid offering coverage, and would eliminate the 40- versus 30-hour threshold as an issue. In fact, it would encourage firms to offer full- rather than part-time jobs with health insurance, reducing the need for public subsidies.

This minimum wage policy makes particular sense if combined with the proposal in my next (future) blog #7: eliminate all family health insurance policies, insure individuals rather than households, and cover all children under age 21 independently of their parents' insurance policies. Making all children eligible for the exchange coverage options regardless of their parents' income would be one possible approach.

Here are links to my five previous blogs from 2013 on taxes and fiscal policy. Still the right direction.

#1 All Taxes and Budgets Should be Expressed as Dollars per Person

#2. Include Social Security and Medicare taxes when discussing tax burdens

#3 Tax Bads (or at least don’t subsidize them!)

#4 State Tax Rates are Not Related to State Income or Growth

#5 “Let the Children and Grandchildren Pay?”




Recommended book on US health care system

I highly recommend this book as a useful summary of the US Health Care System. I have made it required reading (as a reference) for my classes at BU.

The Health Care Handbook: A Clear and Concise Guide to the United States Health Care System, 2nd Edition Paperback – November 15, 2014

by Elisabeth Askin (Author), Nathan Moore (Author)


Paper:  $15.99

Electronic: $8.99