Category Archives: BUHealth

iHEA Milan attracts twenty BU current and former students, faculty and visitors

Among the 1,400 worldwide attendees at the International Health Economics Association biennial meetings in Milan, Italy in 2015, BU was again well represented, with 20 current and former students, faculty, and visitors:

Osea Giuntella, Giulia La Mattina, Francesco Decarolis, Ana Balsa, Daniel Maceira, Julie Shi,  Michal Horny, Randy Ellis, Arturo Schweiger, Wenjia Zhu, Matilde Machado, Hsienming Lien, Jitian Sheu, Sara R Machado, Kathleen Carey, Alan B Cohen, Adam H Shapiro, Monica Galizzi, and  Mead Over.

Hope to see all of you at iHEA 2017 in Boston.

 

Ellis SAS tips for experienced SAS users

If you are a beginning SAS programmer, then the following may not be particularly helpful, but the books suggested in the middle may be. BU students can obtain a free license for SAS to install on their own computers if it is required for a course or a research project; either request requires an email from an adviser. SAS is also available on various computers in the economics department computer labs.

I have also created a companion posting, Ellis SAS tips for new SAS programmers.

I do a lot of SAS programming on large datasets, and thought it would be productive to share some of my programming tips on SAS in one place. Large data is defined to be a dataset so large that it cannot be stored in the available memory. (My largest data file to date is 1.7 terabytes.)

Suggestions and corrections welcome!

Use SAS macro language whenever possible;

It is so much easier to work with short strings than long lists, especially with repeated models and datasteps;

%let rhs = Age Sex HCC001-HCC394;
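For example (a minimal sketch; the dataset name ANALYSIS and dependent variable SPEND are illustrative), the same macro variable can then drive many steps, and changing the variable list later means editing only the %let:

proc reg data = analysis;
model spend = &rhs;
run;
proc means data = analysis;
var &rhs;
run;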

 

Design your programs for efficient reading and writing of files, and minimize temporary datasets.

SAS programs on large data are generally constrained by IO (input/output: reading from and writing to your hard drives), not by CPU (actual calculations) or memory (storage that disappears once your SAS program ends). I have found that some computers with high speed CPUs and multiple cores are slower than simpler computers because they are not optimized for speedy hard drives. Large memory really helps, but for really huge files it will almost always be exceeded, and then your hard drive speeds will really matter. Even when simply reading in and writing out files, hard drive speed will be your limiting factor.

The implication of this is that you should do variable creation in as few DATA steps as possible, and minimize sorts, since reading and saving datasets will take a lot of time. This requires a real change in thinking from Stata, which is designed for changing one variable at a time on a rectangular file. Recall that Stata can do this efficiently since it usually starts by bringing the full dataset into memory before making any changes. SAS does not do this, which is one of its strengths: it can process files far larger than memory.

Learning to use DATA steps and PROC SQL well is the central advantage of an experienced SAS programmer. Invest, and you will save time waiting for your programs to run.

Clean up your main hard drive if at all possible.

Otherwise you risk SAS crashing when your hard drive gets full. If it does, cancel the job and be sure to delete the temporary SAS datasets that may have been created before the crash. The SAS default location for storing temporary files is something like

C:\Users\"your_user_name".AD\AppData\Local\Temp\SAS Temporary Files

Unless you have SAS currently open, you can safely delete all of the files stored in that directory. Ideally, there should be none since SAS deletes them when it closes normally. It is the abnormal endings of SAS that cause temporary files to be saved. Delete them, since they can be large!

Change the default hard drive for temporary files and sorting

If you have a large internal secondary hard drive with lots of space, then change the SAS settings so that it uses temp space on that drive for all work files and sorting operations.

To change this default location to a different internal hard drive, find your sasv9.cfg file which is in a location like

"C:\Program Files\SASHome\x86\SASFoundation\9.3\nls\en"

"C:\Program Files\SASHome2-94\SASFoundation\9.4\nls\en"

Find the line in the config file that starts with -WORK and change it to your own location for the temporary files (mine are on drives J and K), such as:

-WORK "k:\data\temp\SAS Temporary Files"

-UTILLOC "j:\data\temp\SAS Temporary Files"

The first line (-WORK) sets where SAS stores its temporary work files, such as WORK.ONE, which you create with a statement like DATA ONE;

The second line (-UTILLOC) sets where SAS stores its own utility files, such as when sorting a file or saving residuals.

There is a reason to have the WORK and UTIL files on different drives: SAS is then generally reading in from one drive and writing out to a different one, rather than reading and writing on the same drive. Try to avoid the latter. Do some tests on your own computer to see how much time you can save by reading from one drive and writing to another instead of using only one drive.

Use only internal hard drives for routine programming

Very large files may require storage or backup on external hard drives, but these are incredibly slow: three to ten times slower than an internal hard drive. Try to minimize their use for actual project work. Instead, buy more internal drives if possible. You can purchase an additional internal hard drive with 2 TB of space for under $100. You will save that much in time the first day!

Always try to write large datasets to a different disk drive than you read them in from.

Do some tests copying large files from C: to C: and from C: to F:. You may not notice any difference until the file sizes get truly huge, greater than your memory size.

Consider using binary compression to save space and time if you have a lot of binary variables.

By default, SAS stores datasets in a fixed rectangular format that leaves lots of empty space when you use integers instead of real numbers. Although I have long been a fan of using OPTIONS COMPRESS=YES to save space and run time (but not CPU time), I only recently discovered that

OPTIONS COMPRESS=BINARY;

is even better for integers and binary flags when they outnumber real numbers. For some large datasets with lots of zero-one dummies it has reduced my file size by as much as 97%! Standard numeric variables are stored as 8 bytes, which is 64 bits, so in principle you could store 64 binary flags in the space of one real number. Try saving some files under different compression options and see if your run times and storage space improve. Note: compression INCREASES file size for real numbers! It seems that compression saves space when binary flags outnumber real numbers or integers.

Try various permutations of the following on your computer with your actual data to see what saves time and space:

data real;    retain x1-x100 1234567.89101112; do i = 1 to 100000; output; end; run;
proc means; run;

data dummies; retain d1-d100 1;               do i = 1 to 100000; output; end; run;
proc means; run;

*try various datasteps with this, using the same or different drives. Bump up the obs to see how times change;

 

Create a macro file where you store macros that you want to have available whenever you need them. Do the same with your formats:

options nosource;
%include "c://data/projectname/macrofiles";
%include "c://data/projectname/allformats";
options source;
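The included files are just ordinary SAS code. A minimal sketch of what a macro file might contain (here the %cmp macro defined in the companion post for new SAS programmers; the contents are illustrative):

*contents of c://data/projectname/macrofiles;
%macro cmp(data=test);
proc contents data=&data; proc means data=&data (obs=1000); proc print data=&data (obs=10); run;
%mend;

The allformats file would similarly hold the PROC FORMAT definitions you want available in every run.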

Be aware of which SAS procs create large, intermediate files

Some, but not all, procs create huge temporary datasets.

Consider: PROC REG and PROC GLM generate all of their results in one pass through the data unless you have an OUTPUT statement; with one, they create large, uncompressed, temporary files that can be a multiple of your original file size. PROC SURVEYREG and PROC MIXED create large intermediate files even without an OUTPUT statement. Plan accordingly.

Consider using OUTEST=BETA to more efficiently create residuals together with PROC SCORE.

Compare two ways of making residuals;

*make a test dataset with ten million obs, but a trivial model;

data test;
do i = 1 to 10000000;
retain junk1-junk100 12345;   *it is carrying along all these extra variables that slows SAS down;
x = rannor(234567);
y = x + rannor(12345);
output;
end;
run;   *30.2 seconds;
*Straightforward way; times on my computer are shown following each step;
proc reg data = test;
y: model y = x;
output out=resid (keep=resid) residual=resid;
run;  *25 seconds;
proc means data = resid;
run;  *.3 seconds;

*total of the above two steps is 25.6 seconds;

proc reg data = test outest=beta ;
resid: model y = x;
run;                     *3.9 seconds;
proc print data = beta;
run;  *take a look at beta that is created;
proc score data=test score=beta type=parms
out=resid (keep=resid) residual;
var x;
run;       *6 seconds!;
proc means data = resid;
run;  *.3 seconds;

*total from the second method is 10.3 seconds versus 25.6 for the direct approach, PLUS no temporary files need to be created that might crash the system;

If the model statement in both regressions is

y: model y = x junk1-junk100;   *note that all of the junk variables have coefficients of zero, but SAS does not know this going in;

then the two times are

Direct approach:    1:25.84
Scoring approach:  1:12.46 on the regression plus 9.01 seconds on the score = 1:21.47, which is a smaller savings

On very large files the time savings are even greater because of the reduced IO. SAS was still able to do this without writing onto the hard drive in this "small" sample on my computer, but the real savings is on temporary storage space.

Use a bell!

My latest addition to my macro list is the following bell macro, which makes sounds.

Use %bell; at the end of a SAS program that you run in batch mode, so that you notice when the program has finished running.

%macro bell;
*plays the trumpet call, useful to put at end of batch program to know when the batch file has ended;
*Randy Ellis and Wenjia Zhu November 18 2014;
data _null_;
call sound(392.00,70); *first argument is frequency, second is duration;
call sound(523.25,70);
call sound(659.25,70);
call sound(783.99,140);
call sound(659.25,70);
call sound(783.99,350);
run;
%mend;
%bell;

Purchase essential SAS programming guides.

I gave up on purchasing paper copies of the SAS manuals, because they take up more than two feet of shelf space and are still neither complete nor up to date. I find the SAS help menus useful but clunky. I recommend the following books if you are going to do serious SAS programming. Buy them used on Amazon or wherever; an older edition will cost less than $10. Really.

The Little SAS Book: A Primer, Fifth Edition (or an earlier one)

Nov 7, 2012

by Lora Delwiche and Susan Slaughter

Beginners introduction to SAS. Probably the best single book to buy when learning SAS.

 

Professional SAS Programmer's Pocket Reference Paperback

By Rick Aster

http://www.amazon.com/Professional-SAS-Programmers-Pocket-Reference/dp/189195718X

Wonderful, concise summary of all of the main SAS commands, although you will have to already know SAS to find it useful. I use it to look up specific functions, macro commands, and options on various procs because it is faster than using the help menus. But I am old style...

Professional SAS Programming Shortcuts: Over 1,000 ways to improve your SAS programs Paperback

By Rick Aster

http://www.amazon.com/Professional-SAS-Programming-Shortcuts-programs/dp/1891957198/ref=sr_1_1?s=books&ie=UTF8&qid=1417616508&sr=1-1&keywords=professional+sas+programming+shortcuts

I don't use this as much as the above, but if I had time, and were learning SAS instead of trying to rediscover things I already know, I would read through this carefully.

Get in the habit of deleting most intermediate permanent files

Delete files if either

1. You won't need them again or

2. You can easily recreate them again.  *this latter point is usually true;

Beginner programmers tend to save too many intermediate files. Usually it is easier to rerun the entire program than to save the intermediate files. Give your final file of interest a name like MASTER or FULL_DATA and keep modifying it by adding variables, instead of using names like SORTED, STANDARDIZED, RESIDUAL, and FITTED.

Consider a macro that helps make it easy to delete files.

%macro delete(library=work, data=temp, nolist=);

proc datasets library=&library &nolist;
delete &data;
run;
%mend;

*sample macro calls;

%delete(data=temp);   *for temporary WORK files; you can also list multiple file names, although these disappear anyway at the end of your run;

%delete(library=out, data=one two three);   *for multiple two-level files in the OUT library;

%delete(library=out, data=one, nolist=nolist);   *gets rid of the list in the output;

 

 

Ellis SAS tips for New SAS programmers

There is also a posting on Ellis SAS tips for Experienced SAS programmers, which focuses on issues that arise when using large datasets.

 

Randy’s SAS hints for New SAS programmers, updated Feb 21, 2015

  1. ALWAYS begin and intermix your programs with internal documentation. (Note how I combined six forms of emphasis in ALWAYS: color, larger font, caps, bold, italics, underline.) Normally I recommend only one, but documenting your programs is really important. (Using only one form of emphasis is also important, just not really important.)

A simple example to start your program in SAS is

******************
* Program = test1, Randy Ellis, first version: March 8, 2013 – test program on sas features
***************;

Any comment starting with an asterisk and ending in a semicolon is ignored;

 

  2. Most common errors/causes of wasted time while programming in SAS:

a. Forgetting semicolons at the end of a line

b. Omitting a RUN statement, and then waiting for the program to run.

c. Unbalanced single or double quotes.

d. Unintentionally commenting out more code than intended.

e. Foolishly running a long program on a large dataset that has not first been tested on a tiny one.

f. Trying to print out a large dataset which will overflow memory or hard drive space.

g. Creating an infinite loop in a datastep. Here is one silly one; usually they can be much harder to identify.

data infinite_loop;
x=1;
nevertrue=0;
do while (x=1);   *note that DO WHILE requires parentheses around its condition;
if nevertrue=1 then x=0;
end;
run;

h. There are many other common errors and causes of wasted time; I am sure you will find your own.

 

  3. With big datasets, 99% of the time it pays to use the following system OPTIONS:

 

options compress=yes nocenter;

or

options compress=binary nocenter;

Binary compression works particularly well with many binary dummy variables and is sometimes spectacular, saving 95%+ on storage space and hence speed.

 

/* mostly use */
options nocenter   /* SAS sometimes spends many seconds figuring out how to center large printouts of data or results */
ps=9999            /* avoid unneeded headers and page breaks that split up long tables in output */
ls=200;            /* some procs like PROC MEANS give less output if a narrow line size is used */

*other key options to consider;

options obs=max    /* or obs=100; MAX = no limit on the number of obs processed */
nodate nonumber    /* useful if you don't want SAS to embed headers at the top of each page in the listing */
macrogen           /* show the SAS code generated after running the macros */
mprint             /* show how macro code and macro variables resolve */
nosource           /* suppress source code from a long log */
nonotes            /* be careful, but can be used to suppress notes from the log for long macro loops */
;                  *remember to always end with a semicolon!;

 

  4. Use these three key procedures regularly

Proc contents data=test; run; /* shows a summary of the file similar to Stata’s DESCRIBE */
Proc means data = test (obs=100000); run; /* set a max obs if you don’t want this to take too long */
Proc print data = test (obs=10); run;

 

I recommend you create and use regularly a macro that does all three easily:

%macro cmp(data=test);
Proc Contents data=&data; Proc means data = &data (obs=1000); Proc print data = &data (obs=10); run;
%mend;   *note that a macro definition ends with %MEND, not %END;

Then do all three (contents, means, print ten obs) with just

%cmp(data = mydata);

 

  5. Understand temporary versus permanent files

Data one;   *creates a temporary dataset WORK.ONE that disappears when SAS terminates;

Data out.one;   *creates a permanent dataset in the OUT directory that remains even if SAS terminates;

 

Define libraries (or directories):

Libname out "c:/data/marketscan/output";
Libname in "c:/data/marketscan/MSdata";
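Once libraries are defined, any dataset referenced with a two-level name is permanent; a minimal sketch (the dataset names are illustrative):

data out.master;   *written to c:/data/marketscan/output and kept after SAS exits;
set in.claims;     *read from the MSdata library;
run;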
 

 

Output or data can be written into external files:

Filename textdata "c:/data/marketscan/textdata.txt";

 

  6. Run tests on small samples to develop programs, and then toggle between tiny and large samples once debugged.

A simple way is

Options obs=10;
*options obs=max;   *only use this when you are sure your programs run;
 

Some procedures and data steps (for example, those using the END= dataset option) do not work well on partial samples. For those, I often toggle between two different input libraries: create a subset image of all of your data in a separate directory, and then toggle using the LIBNAME commands.

 

*Libname in 'c:/data/projectdata/fulldata';
Libname in 'c:/data/projectdata/testsample';

 

Time spent creating a test data set is time well spent.

You could even write a macro to make the toggling easy; one possible sketch follows.
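A minimal sketch of such a macro (the parameter name and paths are illustrative, following the LIBNAME toggle above):

%macro setlib(test=yes);
%if &test=yes %then %do;
Libname in 'c:/data/projectdata/testsample';
%end;
%else %do;
Libname in 'c:/data/projectdata/fulldata';
%end;
%mend;
%setlib(test=yes);   *switch to test=no once the program is debugged;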

 

  7. Use arrays abundantly. You can use different array names to reference the same set of variables, which is very convenient.

 

%let rhs=x1 x2 y1 y2 count more;
Data _null_;
Array X {100} X001-X100;   *usual form;
Array y {100};   *creates y1-y100;
Array xmat {10,10} X001-X100;   *matrix notation allows two-dimensional indexes;
Array XandY {*} X001-X100 y1-y100 index;   *useful when you don't know the count of variables in advance;
Array allvar &rhs.;   *implicit arrays can use implicit indexes;
 

*see various ways of initializing the array elements to zero;

Do i = 1 to 100; x{i} = 0; end;
 

Do i = 1 to dim(XandY); XandY{i} = 0; end;

 

Do over allvar; allvar = 0; end;   *sometimes this is very convenient;

 

Do i=1 to 100 while (y(i) = . );
y{i} = 0;   *do while and do until are sometimes useful;
end;

 

run;

  8. For some purposes, naming variables in arrays using leading zeros improves the sort order of variables

Use:
Array x {100} X001-X100;
not
Array x {100} X1-X100;

With the second, the alphabetically sorted variables are x1, x10, x100, x11, x12, ..., x19, x2, x20, etc.

 

  9. Learn SET versus MERGE commands (UPDATE is for rare, specialized use)

 

Data three;   *information on the same person combined into a single record;
Merge ONE TWO;
BY IDNO;
Run;

 

  10. Learn key dataset options; a combined example follows this list:

Obs=
Keep=
Drop=
In=
Firstobs=
Rename=(oldname=newname)
End=
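A minimal sketch combining several of these options (the dataset and variable names are illustrative):

data analysis;
merge in.claims (keep=idno fromdate rename=(fromdate=svcdate) in=inclaim)
      in.members (keep=idno age sex);
by idno;
if inclaim;   *keep only people who appear in the claims file;
run;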

 

  11. Keep files being sorted "skinny" by using DROP or KEEP statements

Proc sort data = IN.BIG(keep=IDNO STATE COUNTY FROMDATE) out=out.bigsorted;
BY STATE COUNTY IDNO FROMDATE;
Run;

Also consider the NODUP and NODUPKEY options, which sort while dropping duplicate records (duplicates on all variables, or on just the BY variables, respectively).
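For example (a minimal sketch; names are illustrative):

proc sort data=in.big out=out.unique nodupkey;
by idno;   *keeps the first record for each IDNO and drops the rest;
run;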

 

  12. Take advantage of BY group processing

Use FIRST.var and LAST.var abundantly, as in the sketch below.
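A minimal sketch of FIRST./LAST. processing (counting claims per person; the dataset names are illustrative):

proc sort data=in.claims out=claims; by idno; run;
data percount (keep=idno nclaims);
set claims;
by idno;
if first.idno then nclaims = 0;
nclaims + 1;   *the sum statement implies RETAIN;
if last.idno then output;   *one record per IDNO with its claim count;
run;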

 

USE the special variables:
_N_ = the current observation counter
_ALL_ = the set of all variables, as in PUT _ALL_; (or, when used with PROC CONTENTS, the set of all datasets)

 

Also valuable is

PROC CONTENTS data = in._all_; run;

 

  13. Use lots of comments

 

* this is a standard SAS comment that ends with a semicolon;

 

/*   a PL1 style comment can comment out multiple lines including ordinary SAS comments;

* Like this; */

 

%macro junk; Macros can even comment out other macros or other PL1-style comments;

/*such as this; */ * O Boy!; %macro ignoreme; %mend;   *very powerful;

%mend; *end macro junk;

 

  14. Use meaningful file names!

(Although for quick scratch work, Data ONE TWO THREE can be useful.)

 

  15. Put internal documentation about what the program does, who did it, and when.
  16. Learn basic macro language; see the SAS program demo for examples. Know the difference between executable and declarative statements used in the DATA step.

 

17. EXECUTABLE COMMANDS USED IN THE DATA STEP (these actually DO something, once for every record)

 

Y=y+x;   (assignment; in Stata you would use GEN or REPLACE)
 
Do I = 1 to 10;
End;   (always paired with DO; can be nested to nearly unlimited depth)

 

INFILE 'c:/data/MSDATA/claimsdata.txt';   *defines where INPUT statements read from;
FILE 'c:/data/MSDATA/mergeddata.txt';   *defines where PUT statements write to;

 

Goto johnny;      * always avoid. Use do groups instead;

 

IF a=b THEN y=0 ;
ELSE y=x; * be careful when multiple if statements;
CALL subroutine(); (Subroutines are OK, Macros are better)

 

INPUT   X ; (read in one line of X as text data from INFILE)
PUT   x y= / z date.; (Write out results to current LOG or FILE file)

 

MERGE IN.A IN.B ;
BY IDNO;         *   Match up with BY variable IDNO as you simultaneously read in A&B;

Both files must already be sorted by IDNO.

SET A B;                                           * read in order, first all of A, and then all of B;

UPDATE   A B; *replace variables with new values from B only if non missing in B;

 

OUTPUT out.A;      *Write out one obs to out.A SAS dataset;
OUTPUT;                *Writes out one obs of every output file being created;

DELETE;   * do not output this record, and return to the top of the datastep;

STOP;                               * ends the current SAS datastep;

 

18. DECLARATIVE COMMANDS USED IN THE DATA STEP (processed only once, at the start of the data step)

 

DATA ONE TWO IN.THREE;

*This would create three datasets, named ONE, TWO, and IN.THREE;
*Only the third one will be kept once SAS terminates;

Array x {10} x01-x10;
ATTRIB x length=8 Abc length=$8;   *note: numeric variables have a maximum length of 8 bytes;
RETAIN COUNT 0;
BY state county IDNO;
Also consider BY DESCENDING IDNO; or BY IDNO NOTSORTED; if grouped but not sorted by IDNO;
DROP i;   *do not keep i in the final dataset, although it can still be used while the data step is running;
KEEP IDNO AGE SEX;   *this will drop all variables from the output file except these three;
FORMAT x date.;   *permanently links the format DATE. to the variable x;

INFORMAT ABC $4.;

LABEL AGE2010 = "Age on December 31, 2010";
LENGTH x 8;   *must be assigned the first time you reference the variable;
RENAME AGE = AGE2010;   *the new name applies to the output dataset; within the data step you still use the old name (AGE);
OPTIONS OBS=100;   *one of many system options; note it is processed only once;

 

19. Key Systems language commands

LIBNAME to define libraries
FILENAME to define specific files, such as for text data to input or output text

TITLE This title will appear on all output in the listing until a new TITLE statement is given;

%INCLUDE

%LET year=2011;

%LET ABC = "Randy Ellis";
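Macro variables then resolve anywhere in later code; a small sketch (the dataset names are illustrative):

title "Claims analysis for &year";
data claims&year;   *resolves to claims2011;
set in.claims&year;   *the IN library must already be defined with LIBNAME;
run;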

 

20. Major procs you will want to master

DATA step !!!!! Counts as a procedure;

PROC CONTENTS

PROC PRINT

PROC MEANS

PROC SORT

PROC FREQ                      frequencies

PROC SUMMARY      (Can be done using MEANS, but easier)

PROC CORR (Can be done using Means or Summary)

PROC REG       OLS or GLS

PROC GLM   General Linear Models with automatically created fixed effects

PROC FORMAT /INFORMAT

PROC UNIVARIATE

PROC GENMOD nonlinear models

PROC SURVEYREG   clustered errors

None of the above will execute unless a new PROC is started OR you include a RUN; statement.

21. Formats are very powerful. Here is an example from the MarketScan data. One use is to simply recode variables so that richer labels are possible.

 

Another use is to look up or merge on other information in large files.

 

Proc format;   *character formats ($) require quoted value ranges;
value $region
'1'='1-Northeast Region'
'2'='2-North Central Region'
'3'='3-South Region'
'4'='4-West Region'
'5'='5-Unknown Region'
;

value $sex
'1'='1-Male'
'2'='2-Female'
other='Missing/Unknown'
;

 

*Three different uses of formats;

Data one ;
sex='1';
region=1;
Label sex = 'patient sex = 1 if male';
label region = 'census region';
run;

Proc print data = one;

Run;

 

data two;
set one;
Format sex $sex.; * permanently assigns sex format to this variable and stores format with the dataset;
Run;

Proc print data = two;
Run;

Proc contents data = two;
Run;

*be careful if the format is very long!;

 

Data three;
Set one;
Charsex=put(sex,$sex.);
Run;

*maps sex into the label, and saves a new variable containing the text strings. Be careful: these can be very long;

Proc print data =three;
Run;

 

Proc print data = one;
Format sex $sex.;
*this is almost always the best way to use formats: Only on your results of procs, not saved as part of the datasets;
Run;

 

If you are trying to learn SAS on your own, then I recommend you buy:

The Little SAS Book: A Primer, Fifth Edition (or an earlier one)

Nov 7, 2012

by Lora Delwiche and Susan Slaughter

Beginners introduction to SAS. Probably the best single book to buy when learning SAS.

Deflategate pressure drop is consistent with a ball air temperature of 72 degrees when tested initially.

I revised my original Deflategate posting after learning that it is absolute air pressure, not pressure above standard sea level pressure, that follows the Ideal Gas Law. I also allowed for stretching of the leather once the ball becomes wet, and for the possibility that the ball, wet from the cold rain, was colder (45 degrees F) than the recorded air temperature of 53 degrees F. Together these adjustments make it even easier for the weather to fully explain the drop in ball pressure.

My Bottom Line: The NFL owes the Patriot Nation and Bob Kraft a big apology.

Correction #1: My initial use of the ideal gas formula did not recognize that it is absolute pressure, not pressure above the ambient air pressure, that matters. A ball with a gauge pressure of 12.5 PSI is actually 12.5 PSI above the surrounding air pressure, which is about 14 PSI at sea level. So a decline from 12.5 PSI to 10.5 PSI is actually only an 8.2 percent decline in absolute pressure, from 26.5 to 24.5 PSI. This makes it much easier for temperature changes to explain the difference in ball pressure. Only an 8.2 percent change in absolute temperature (approximately a 42 degree Fahrenheit drop) would be required if that were the only change.

Correction #2: It is well established that water allows leather to stretch. I found one site noting that leather can stretch by 2-5% when wet, although it does not specify how much force is needed to achieve this.

https://answers.yahoo.com/question/index;_ylt=A0LEVvwgfs9UP0AAr40nnIlQ?qid=20060908234923AAxt7xP

It is plausible that a new leather ball under pressure (scuffed up to let in the moisture quickly) might stretch 1 percent upon getting wet (such as in the rain). Since volume goes up with the cube of this stretching, this would be a (1.01)^3 - 1 = 3 percent increase in ball volume, and hence a 3 percent decline in pressure. This would reduce the absolute temperature difference needed for the 2 PSI drop to only 5.2 percent (a change of only 27 degrees F).

Correction #3: It was raining on game day, and the rain was probably much colder than the outside air temperature. So it is plausible that the game ball was as cold as 45 degrees Fahrenheit at game time when the low ball pressures were detected. This makes even lower initial testing temperatures consistent with the professed levels of underinflation.

A single formula calculates the ball temperature needed at the initial test to explain a ball pressure measured during the game that is 2 PSI lower, after the ball gets colder (to 45 degrees F), about 0.4 percent smaller (since ball volume shrinks when cold), and stretched 1% due to rain:

Pregame testing temperature (F) = (absolute pressure ratio) / (volume change due to cold) / (volume change due to 1% leather stretch when wet) * (45 degree game ball temperature + 460) - 460

= (12.5+14)/(10.5+14)/(0.996)/(1.01^3)*(45+460) - 460 = 72 degrees Fahrenheit
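For readers who want to check the arithmetic, a minimal sketch in SAS (parameters exactly as stated above):

data _null_;
p_test = 12.5 + 14;     *absolute pregame test pressure in PSI;
p_game = 10.5 + 14;     *absolute halftime pressure in PSI;
vol_cold = 0.996;       *ball volume shrinks slightly when cold;
vol_stretch = 1.01**3;  *a 1% leather stretch raises volume about 3%;
t_game = 45;            *ball temperature during the game, degrees F;
t_test = p_test/p_game/vol_cold/vol_stretch*(t_game + 460) - 460;
put t_test=;            *prints approximately 72 (degrees F);
run;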

Given this math, it would have been surprising if the ball pressure had NOT declined significantly.

Final comment #1: All of these calculations and hypotheses can be tested empirically. See the empirical analysis done by Headsmart Labs (http://www.headsmartlabs.com). They find that rain plus a 25 degree temperature drop is consistent with a 1.82 PSI decrease.

Final comment #2: Since the original game balls were reinflated by officials during halftime, the true ball pressures during the first half will never be known. Moreover there seems to be no documentary record of their pressures at the time they were re-inflated.

Super Bowl XLIX was a terrific game from the point of view of Patriots fans. Now it is time for the NFL to own up to its own mistake in accusing the Patriots of cheating. It was just a matter of physics.

Revised calculations

 

Various combinations of testing temperatures and PSI (revised)

Columns B-G adjust for temperature only, correcting for absolute pressure at 14 PSI at sea level; columns H-L adjust for changes in ball volume; columns M-O adjust for both temperature and football volume. Columns E-G and M-O correspond to game time PSI readings of 10.5, 11, and 11.5.

         B      C       D        E     F     G     H         I        J         K         L       M     N     O
         Temp F AbsTemp TempAdj  10.5  11.0  11.5  SurfArea  SphRad   FootbRad  Volume    VolAdj  10.5  11.0  11.5

Game time temperature:
          45    505     1.000    10.5  11.0  11.5  189       3.8782   3.81183   232       1.000   10.5  11.0  11.5

Possible testing temperatures:
          60    520     1.030    11.2  11.7  12.3  189.2427  3.8807   3.81427   232.447   0.998   11.3  11.8  12.3
          70    530     1.050    11.7  12.2  12.8  189.4045  3.8824   3.81590   232.7451  0.997   11.8  12.3  12.8
          80    540     1.069    12.2  12.7  13.3  189.5663  3.8840   3.81753   233.0434  0.996   12.3  12.9  13.4
          90    550     1.089    12.7  13.2  13.8  189.7280  3.8857   3.81916   233.3418  0.994   12.8  13.4  13.9
         100    560     1.109    13.2  13.7  14.3  189.8898  3.8873   3.82079   233.6403  0.993   13.4  13.9  14.5
         110    570     1.129    13.7  14.2  14.8  190.0516  3.8890   3.82242   233.939   0.992   13.9  14.5  15.0
         120    580     1.149    14.1  14.7  15.3  190.2134  3.8906   3.82404   234.2378  0.990   14.4  15.0  15.6
         130    590     1.168    14.6  15.2  15.8  190.3752  3.8923   3.82567   234.5367  0.989   14.9  15.5  16.1
         140    600     1.188    15.1  15.7  16.3  190.5370  3.8940   3.82730   234.8357  0.988   15.5  16.1  16.7
         150    610     1.208    15.6  16.2  16.8  190.6988  3.8956   3.82892   235.1349  0.987   16.0  16.6  17.2
         160    620     1.228    16.1  16.7  17.3  190.8606  3.8973   3.83054   235.4342  0.985   16.5  17.1  17.8

Temperature (F) at which the ball would pass the test:

                                             2 PSI diff   1.5 PSI diff   1 PSI diff
Temperature only                                 86            75            65
Temperature and volume change from temp          88            77            67
Temp, volume, and stretching from wetness        72            62            51

The last row is calculated as (12.5+14)/(inferred test PSI+14)/(0.996)/(1.01^3)*(45+460)-460.

Notes
Revised calculations allow for sea level pressure of 14 PSI, so a change from 10.5 to 12.5 PSI (above this level) requires only a (12.5+14)/(10.5+14)-1 = 8.2 percent change in absolute temperature.
See the notes at the top: the final calculations also allow for the possibilities that the ball temperature was 45 degrees, not 53, due to the cold rain, and that the leather stretched 1% due to rain.
Fields in the first row and first column are input parameters; the others are calculated.

 

Original post

There is no mention of the temperature at which the footballs need to be stored or tested in the official NFL rule book. (Sloppy rules!)

The process of scuffing up the new balls to make them feel better no doubt warms them up. It would be impossible for it to be otherwise. An empirical question is how much did it warm them up and what temperature were they when tested?

Surface temps could have been below their internal temperature of the air, which is what matters for the pressure. Leather is a pretty good insulator (hence its use in many coats).

Anyone who took high school physics may remember that pressure and temperature satisfy

PV=nRT

Pressure*Volume=Number of moles*ideal gas constant*Temperature  (Ideal Gas Law)

Temperature needs to be measured in degrees above absolute zero, which is -459.67 Fahrenheit (sorry, metric readers!). The temperature at game time was 53 degrees. So the right question to ask is: at what temperature T1 would the air in the ball have to be at the time the balls were tested, such that once they cooled down to T0 = 53 degrees they measured two pounds per square inch (PSI) below the allowed minimum?

The lowest allowed pressure for testing was 12.5 PSI. We are told only vaguely that the balls were 2 PSI lower than this, but this is not a precise number; it could have been rounded up from 1.501 PSI, which would mean the balls might have been 11 PSI when tested during the game. I examine 10.5, 11, and 11.5 as possible game time test PSI levels. The following tables show possible combinations of game time testing temperatures and half-time testing temperatures that would be consistent with various pressures. The right hand side of the table adjusts for the fact that the leather/rubber in the ball would also have shrunk as the ball cooled down, which works against the temperature effect. Using the formula PSI1 = PSI0*(T1+459.67)/(T0+459.67) (see correction above!) and ignoring the volume change of the ball, it is straightforward to solve for the initial temperature the balls would have had to be at for the observed game time pressures.

Adjusting for a plausible guess at the small amount that the leather plus rubber bladder would have also changed makes only a small difference.

For a 1.5 PSI difference from testing to halftime, the air inside the balls would have had to be at about 128 degrees at the time they were tested. (The leather skin could have been a lower temperature.) This would have made them feel warm, but not burning hot, to the hand.

Allowing the balls to be warm when tested is sneaky or perhaps accidental, but not cheating.

Go Pats!

Various combinations of testing temperatures and PSI (original)

Columns B-G adjust for temperature only; columns H-L adjust for changes in ball volume; columns M-O adjust for both temperature and football volume. Columns E-G and M-O correspond to game time PSI readings of 10.5, 11, and 11.5.

         B      C       D        E     F     G     H         I        J         K         L       M     N     O
         Temp F AbsTemp TempAdj  10.5  11.0  11.5  SurfArea  SphRad   FootbRad  Volume    VolAdj  10.5  11.0  11.5

Game time temperature:
          53    512.67  1.000    10.5  11.0  11.5  189       3.8782   3.81183   232       1.000   10.5  11.0  11.5

Possible testing temperatures:
          80    539.67  1.053    11.1  11.6  12.1  189.4368  3.8827   3.81623   232.8048  1.003   11.0  11.5  12.1
          90    549.67  1.072    11.3  11.8  12.3  189.5986  3.8844   3.81786   233.1031  1.005   11.2  11.7  12.3
         100    559.67  1.092    11.5  12.0  12.6  189.7604  3.8860   3.81949   233.4015  1.006   11.4  11.9  12.5
         110    569.67  1.111    11.7  12.2  12.8  189.9222  3.8877   3.82112   233.7001  1.007   11.6  12.1  12.7
         120    579.67  1.131    11.9  12.4  13.0  190.0840  3.8893   3.82274   233.9988  1.009   11.8  12.3  12.9
         130    589.67  1.150    12.1  12.7  13.2  190.2458  3.8910   3.82437   234.2976  1.010   12.0  12.5  13.1
         140    599.67  1.170    12.3  12.9  13.5  190.4076  3.8926   3.82600   234.5965  1.011   12.1  12.7  13.3
         150    609.67  1.189    12.5  13.1  13.7  190.5693  3.8943   3.82762   234.8956  1.012   12.3  12.9  13.5
         160    619.67  1.209    12.7  13.3  13.9  190.7311  3.8959   3.82924   235.1948  1.014   12.5  13.1  13.7

Temperature (F) at which the ball would pass the test, for game time readings of 10.5 / 11 / 11.5 PSI: 151 / 123 / 98 adjusting for temperature only, and 159 / 128 / 101 adjusting for both temperature and volume.

Notes
Fields in the first row and first column (shown in yellow in the original spreadsheet) are input parameters; the others are calculated.
Column C is temperature minus absolute zero.
Column D is the ratio of column C to the game time temperature in absolute degrees, and shows how much higher PSI would have been than at game time.
Columns E through G show possible testing PSI for three possible game time PSI levels.
Columns H through L show adjustments to volume, which tend to reduce the PSI as a ball is heated. Calculations use the rate of expansion of hard rubber per inch per degree.
Columns M through O show ball PSI after adjusting for both air temperature and football volume.

Parameters and formulas
Absolute zero = -459.67 Fahrenheit.
Hard rubber linear expansion = 42.8 x 10^-6 in/(in F), i.e. 0.0000428 (http://www.engineeringtoolbox.com/linear-expansion-coefficients-d_95.html); used for column I. Surface area is assumed to grow with the square of this proportion with temperature.
The approximate volume and surface area of a standard football are 232 cubic inches and 189 square inches, respectively (http://www.answers.com/Q/Volume_and_surface_area_of_a_football).
Surface of a sphere: 4*pi*r^2, used to calculate the radius of the sphere.
Volume of a sphere: (4/3)*pi*r^3, used to calculate the volume of the football. Volume is adjusted downward by a fixed proportion because footballs are not spheres.

 

NFL rules

Rule 2  The Ball

Section 1  BALL DIMENSIONS

The ball must be a "Wilson," hand selected, bearing the signature of the Commissioner of the League, Roger Goodell. The ball shall be made up of an inflated (12 1/2 to 13 1/2 pounds) urethane bladder enclosed in a pebble grained, leather case (natural tan color) without corrugations of any kind. It shall have the form of a prolate spheroid and the size and weight shall be: long axis, 11 to 11 1/4 inches; long circumference, 28 to 28 1/2 inches; short circumference, 21 to 21 1/4 inches; weight, 14 to 15 ounces. The Referee shall be the sole judge as to whether all balls offered for play comply with these specifications. A pump is to be furnished by the home club, and the balls shall remain under the supervision of the Referee until they are delivered to the ball attendant just prior to the start of the game.

From the Free Dictionary: ideal gas law, n. A physical law describing the relationship of the measurable properties of an ideal gas, where P (pressure) x V (volume) = n (number of moles) x R (the gas constant) x T (temperature in Kelvin). It is derived from a combination of the gas laws of Boyle, Charles, and Avogadro. Also called universal gas law.

 

#6 Raise the minimum wage for jobs not offering health insurance

Time to change the policy discussion.

Congress has been unwilling to raise the minimum wage despite strong public support for doing so. This blog suggests a concrete approach for getting even broader public support and potentially reducing the need for federal taxes.

As of January 1, 2015, 29 states and DC have minimum wages above the federal minimum wage, which is still only $7.25 per hour. For a worker working 40 hours per week, 50 weeks per year, the minimum wage yields only $14,500 per year, which is below the 2014 federal poverty level for a family of two ($15,730) in all states and DC. At these low income levels, even full time employees still cannot afford health insurance and will mostly be relying on large subsidies for health insurance and the earned income tax credit (EITC). The insurance subsidy for a minimum wage worker enrolling in a private silver plan is currently at least $4,237 per year for an adult with one child, while the EITC is currently $3,359 for a single worker with one child earning the minimum wage. Hence an employer paying only the minimum wage is counting on a government subsidy of at least $7,596 per year for a worker with one child, which is $3.80 per hour ($7,596 divided by 2,000 hours).

A simple approach that will encourage more firms to offer health insurance is to raise the minimum wage required for any position that does not include any offer of subsidized health insurance. For concreteness I propose a minimum wage of $12 per hour without health insurance, versus $8 per hour with a job that includes subsidies for health insurance. (Those age 21 and under would also be eligible for the $8 per hour rate.) Whether the job is for 10, 30 or 40 hours per week does not matter, only whether there is subsidized health insurance. This four dollar per hour increment will encourage firms to bear the full cost of their workers, and reduce the burden on federal tax revenue and the budget.

In Massachusetts, the minimum wage just increased on January 1, 2015 from $8 to $9 per hour. The state's economy continues to do well, and I still see help wanted signs in retail windows. Plus we still have lots of low-cost food and retail stores and services. Reduced employment is not visible, and would likely be more than offset by the stimulative effects of reduced taxes. I see no reason why we couldn't leave it up to states to decide whether they want to use the same or higher minimum wages for jobs with or without health insurance, as long as the two minimums are reached.

In Australia, the minimum wage is US$13.84 (16.87 Australian dollars), everyone has national health insurance, and the unemployment rate, at 6.2 percent (November 2014), is comparable to that of the US. When we were there in 2011, we rather liked that our gardener and most restaurant workers were Australian citizens who spoke English well, not low-paid foreigners and recent immigrants, as they often are in the US.

As I write this blog, the US Congress is debating whether to partially undo the employer mandate provisions of the Affordable Care Act by allowing firms to pay no penalty for not offering health insurance to employees working less than 40 hours per week; the current standard is 30 hours per week. This could have a disastrous effect, since so many workers work about 40 hours and it would be easy for employers to avoid the (modest) ACA penalties by reducing worker hours. Plus, without the employer mandate, many workers will remain uninsured. Having a higher minimum wage for jobs not offering health insurance would greatly weaken the incentive for firms to cut employee hours to avoid offering insurance coverage, and would eliminate the 40 versus 30 hours issue. In fact, it would encourage firms to offer full- rather than part-time jobs with health insurance, reducing the need for public subsidies.

This minimum wage policy makes particular sense if it is combined with the proposal in my next (future) blog #7: eliminate all family health insurance policies, insure individuals rather than households, and have all children under age 21 covered independently of their parents' insurance policies. Making all children eligible for the exchange coverage options regardless of their parents' income would be one possible approach.

Here are links to my five previous blogs from 2013 on taxes and fiscal policies. Still the right direction.

#1 All Taxes and Budgets Should be Expressed as Dollars per Person

#2. Include Social Security and Medicare taxes when discussing tax burdens

#3 Tax Bads (or at least don’t subsidize them!)

#4 State Tax Rates are Not Related to State Income or Growth

#5 “Let the Children and Grandchildren Pay?”

 

 

 

Recommended book on US health care system

I highly recommend this book as a useful summary of the US Health Care System. I have made it required reading (as a reference) for my classes at BU.

The Health Care Handbook: A Clear and Concise Guide to the United States Health Care System, 2nd Edition Paperback – November 15, 2014

by Elisabeth Askin (Author), Nathan Moore (Author)

 

Paper:  $15.99

http://www.amazon.com/gp/product/0692244735

Electronic: $8.99

http://www.amazon.com/Health-Care-Handbook-Concise-United-ebook/dp/B00PWQ93M8/

 

Re-envisioning Ebola, including updated story about Nigeria from Kas Nwuke

Arlene Ash, Professor and Division Chief, Biostatistics and Health Services Research, at UMass Medical School, has compiled a useful series of original thoughts, emails, and links about Ebola which I am broadcasting and reposting on my blog site here.

This posting repeats some of the information already posted in my earlier blog:

Ebola is being contained in Nigeria

The original article by Kas Nwuke is now linked (with permission) as a pdf and includes linked references on my web site. (It is 6 pages – updated to include two pages of references.)

Containing Ebola: A success story from an “unexpected” place?

From Arlene Ash:

Friends and Colleagues,

Here’s what I [Arlene Ash] sent previously with some updates.

I now have Mead Over’s permission to circulate his text that is included below, plus sharing the link to his Twitter log: @MeadOver.

Also, I have added the text from yesterday’s NYT editorial “Cuba’s Impressive Role on Ebola,” since non-subscribers may not be able to get it themselves on-line.  The full text, with links and commentary, is very interesting, and I think important.

These are, indeed, extraordinary times – and, I firmly believe, they offer an extraordinary opportunity to discard old, dysfunctional paradigms – if only we can seize it.

Arlene

_

Last weekend I [Arlene Ash] wrote:

Re-envisioning Ebola as an opportunity

Friends, If you like this idea as well as I do, perhaps you can help make it “go viral.”

  • I believe it would be cheaper to stop Ebola in Africa than to try to seal our borders against it as it spreads unchecked.
  • I believe that taking a leadership role in stopping Ebola would do a great deal for our self-esteem as a nation, and for our regard in the world.
  • I believe that cost-effectiveness calculations could make a strong case for a "war on Ebola" as the best kind of war that we could wage. I propose we could do more to combat ISIS and protect America by working with the world community to prevent the spread of Ebola in Africa than by any level of commitment of troops and weapons to the enflamed Middle East.

I want America to re-envision Ebola as an opportunity to demonstrate what great things we can do when we bend ourselves to the task.

Of course we are all busy, but perhaps it takes only a little help from many people to spread a really good idea.

Thought for the day. Please grow it and pass it along.

_

I got back some very interesting feedback which I would like to share:

From Randy Ellis (a success story in Nigeria, with lessons for the rest of the world):

Amid so much negative and scary news about Ebola, this research paper on the experience of Nigeria where it has not spread widely after arriving by airplane gives great hope. I recommend it if you have time (It is 6 pages).

Containing Ebola: A success story from an “unexpected” place? [Now linked instead of attached as a pdf]

The author, Kasirim Nwuke, is a BU Ph.D. Here is his bio from one web site.

http://www.elearning-africa.com/profiles/profile_popup.php?address_id=595692&lang=4

_

Then a follow-on from Mead Over, author of a World Bank report (Twitter log  @MeadOver):

This is indeed a good story with details that go beyond the information our World Bank report (in the box on page 29) on the efforts of Senegal and Nigeria that I co-authored on October 7 and blogged on Friday:

http://www.cgdev.org/blog/understanding-world-banks-estimate-economic-damage-ebola-west-africa

http://documents.worldbank.org/curated/en/2014/10/20270083/economic-impact-2014-ebola-epidemic-short-medium-term-estimates-west-africa

The box on page 29 of the WB report was requested by JYK after he sat next to Goodluck Jonathan at the UNGA meeting last week and President Jonathan told him that 1,000 Nigerian public health workers were involved in the contact tracing including almost 300 Nigerian doctors.  This is remarkable not only for the level of effort, but also in comparison to Liberia, Sierra Leone and Guinea each of which had fewer than 100 doctors before the crisis.  In Nigeria I have heard that the polio eradication workers are the ones who were redeployed to do the Ebola contact tracing.  Other countries don’t have the polio program because they don’t have polio.  So even a relatively wealthy country like Ghana may have trouble emulating Nigeria’s success.

I like the point made in the article that Nigeria showed courage in announcing the danger far and wide and rolling out a massive public health effort to contain it.  This was before the rest of the world was taking the epidemic as seriously as they are today, and thus the measures could well have been opposed by economic interests.  (Parallel to HIV:  In the early days of the HIV epidemic, business interests in Thailand opposed the admission that HIV was a problem.  In “Confronting AIDS” we attribute Thailand’s energetic and remarkably successful “100% condom program” partly to the fact that the country was under a military dictatorship for 6 months and the “benevolent dictator” saw the wisdom of opposing the economic interests in order to start that program.)

When I spoke on Ebola at American University the other evening, one of the other panelists was an anthropologist who had recently returned from Sierra Leone. She also reported the "Ebola handshake" and other "self-isolation" behavior from that country. Epidemiologists are hoping that such behavior, developing in response to the news and the public information campaign, will reduce the reproductive rate of the epidemic. But we have not seen a deceleration in Liberia or Sierra Leone yet.

Another implication of the author’s account and of the Nigerian and Senegalese public health expenditure amounts reported in the box of the World Bank report is that several West African countries are increasing government spending in response to the outbreak (as is the US).  Our World Bank report does not include the possible stimulus effect of this spending on national economies.  This spending may offset some of the reduction in aggregate demand due to aversion behavior, and thus reduce the economic impact below our estimates.  However, as I say at the end of my blog, unless the epidemic begins to decelerate soon, our “High Ebola” estimate may fall short of estimating the total impact.  And I hope that when Charles Kenny and I join CDC and others in asserting this is still a small problem inside the US, we are not being overly optimistic.  As here:

https://www.youtube.com/watch?v=_jCWkDYwN2g; https://www.youtube.com/watch?v=113kLL3pZQQ

One frustrating aspect of the report by Kasirim Nwuke is the lack of references or hyperlinks [AA: they are now attached in a separate file.]  Even our World Bank report did better.  I agree totally with his conclusion that Nigeria is not yet “safe”.  Each day is another roll of the dice.  In one sense, Nigeria was lucky that they detected the first case on entry.  Next time they may not be so lucky.

_

In response, Kas Nwuke KNwuke@uneca.org wrote (on 10/18/14):

Going through the materials, I have come to know that Nigeria's preparations started much earlier. They started once the outbreak began in Guinea, and reached full steam after the July ECOWAS Heads of State Summit. That Summit discussed Ebola in the sub-region and resolved that member States of ECOWAS should be prepared to contain it. According to the Health Minister, Nigeria made the very first financial donation after the Summit, US$3.5 million to the three countries. Back home, the Health Minister briefed the Commissioners for Health in the 36 States of the Federation and asked for increased vigilance.

 

You will find this additional information in the references.

 

In my essay, I had given the number of Nigerians who have volunteered to go to Liberia and Sierra Leone as 200.  I have since learned that the number is actually 591.  In addition, Nigeria is also providing crash courses to health personnel from the three most affected countries.

 

I am sure that lots more will be written about Nigeria's experience. I hope that the lessons can be of value to resource-constrained countries on how to handle and tackle epidemics in the future.

 

(I must with regret inform you that Nigeria's election politics has now entered the Ebola debate. Rivers State and Lagos State are controlled by the opposition. Electioneering campaigns for next year's election have started, and the ruling PDP and the opposition APC are each seeking to claim credit for the success in containing the spread of Ebola. The Rivers State Governor has just disclosed (see the hyperlink) that the state spent N1.106 billion, more than $6 million, to tackle Ebola.)

 

With best wishes,

 

Kas

-

Also, some inspiring information about a UMass colleague (Steven Hatch) now in Liberia:

http://www.nytimes.com/2014/10/17/world/africa/pursuing-a-calling-that-leads-to-west-africa.html

http://www.nytimes.com/2014/10/17/world/africa/ebola-liberia-west-africa-epidemic.html

and a NYT “conspicuous success story” about Senegal, that points to the so far very positive Nigerian experience as well.

-

Also,

NYT, October 19 Op-Ed: “Cuba’s Impressive Role on Ebola” (http://www.nytimes.com/2014/10/20/opinion/cubas-impressive-role-on-ebola.html?_r=0)

Cuba is an impoverished island that remains largely cut off from the world and lies about 4,500 miles from the West African nations where Ebola is spreading at an alarming rate. Yet, having pledged to deploy hundreds of medical professionals to the front lines of the pandemic, Cuba stands to play the most robust role among the nations seeking to contain the virus.

Cuba’s contribution is doubtlessly meant at least in part to bolster its beleaguered international standing. Nonetheless, it should be lauded and emulated.

The global panic over Ebola has not brought forth an adequate response from the nations with the most to offer. While the United States and several other wealthy countries have been happy to pledge funds, only Cuba and a few nongovernmental organizations are offering what is most needed: medical professionals in the field.

The Cuban health sector is aware of the risks of taking on dangerous missions. Cuban doctors assumed the lead role in treating cholera patients in the aftermath of Haiti’s earthquake in 2010. Some returned home sick, and then the island had its first outbreak of cholera in a century. An outbreak of Ebola on the island could pose a far more dangerous risk and increase the odds of a rapid spread in the Western Hemisphere.

Cuba has a long tradition of dispatching doctors and nurses to disaster areas abroad. In the aftermath of Hurricane Katrina in 2005, the Cuban government created a quick-reaction medical corps and offered to send doctors to New Orleans. The United States, unsurprisingly, didn’t take Havana up on that offer. Yet officials in Washington seemed thrilled to learn in recent weeks that Cuba had activated the medical teams for missions in Sierra Leone, Liberia and Guinea.

With technical support from the World Health Organization, the Cuban government trained 460 doctors and nurses on the stringent precautions that must be taken to treat people with the highly contagious virus. The first group of 165 professionals arrived in Sierra Leone in recent days. José Luis Di Fabio, the World Health Organization’s representative in Havana, said Cuban medics were uniquely suited for the mission because many had already worked in Africa. “Cuba has very competent medical professionals,” said Mr. Di Fabio, who is Uruguayan. Mr. Di Fabio said Cuba’s efforts to aid in health emergencies abroad are stymied by the embargo the United States imposes on the island, which struggles to acquire modern equipment and keep medical shelves adequately stocked.

In a column published over the weekend in Cuba’s state-run newspaper, Granma, Fidel Castro argued that the United States and Cuba must put aside their differences, if only temporarily, to combat a deadly scourge. He’s absolutely right.

 

Ebola is being contained in Nigeria

Amid so much negative and scary news about Ebola, this research paper on the experience of Nigeria, where Ebola has not spread widely after arriving by airplane, gives great hope. I recommend it if you have time (it is 6 pages, updated to include references).

Containing Ebola: A success story from an "unexpected" place?

The author, Kasirim Nwuke, is a BU Ph.D. Here is his bio from the elearning-africa web site.

http://www.elearning-africa.com/profiles/profile_popup.php?address_id=595692&lang=4

Kasirim Nwuke

Kasirim Nwuke is Chief, New Technologies and Innovation at the United Nations Economic Commission for Africa (ECA), Addis Ababa, Ethiopia. He has taught at a number of higher education institutions in the United States of America, including Tufts University, Medford, MA; Wellesley College, Wellesley, MA; and Northeastern University, Boston, MA. He has been a Research Associate at the Harvard University School of Public Health and a Fellow in African Studies at the African Studies Center, Boston University. He has held different positions at the United Nations Economic Commission for Africa and served as Senior Economic Adviser to the Minister of Finance of the Federal Republic of Nigeria. Kasirim is the author (or lead author) of several research papers, reports, and policy briefs on African economic development. Among the books to which he has been a contributing author is "AfricaDotEdu: IT Opportunities and Higher Education in Africa" (Maria Beebe et al.). Kasirim holds a PhD in Economics from Boston University, Boston, MA.

Former BU professor and World Bank senior economist Mead Over has also been blogging on Ebola in West Africa. Here is one of his recent posts, followed by the World Bank report it discusses.

http://www.cgdev.org/blog/understanding-world-banks-estimate-economic-damage-ebola-west-africa

http://documents.worldbank.org/curated/en/2014/10/20270083/economic-impact-2014-ebola-epidemic-short-medium-term-estimates-west-africa


Important Reposting on Placebo surgery from TIE

I am forwarding this excellent TIE post since every health researcher, and indeed every consumer, should realize how serious the lack of evidence is for many common surgical procedures. Here are some key quotes, organized succinctly.

"2002... arthroscopic surgery for osteoarthritis of the knee ... Those who had the actual procedures did no better than those who had the sham surgery. " (We still spend $3 billion a year on this procedure)
"2005... percutaneous laser myocardial revascularization, ...  didn’t improve angina better than a placebo"
"2003, 2009, 2009... vertebroplasty — treating back pain by injecting bone cement into fractured vertebrae ... worked no better than faking the procedure."
"2013 ... arthroscopic procedures for tears of the meniscus cartilage in the knee... performed no better than sham surgery" (We do about 700,000 of them with direct costs of about $4 billion.)
"[2014] ... systematic review of migraine prophylaxis [prevention], while 22 percent of patients had a positive response to placebo medications and 38 percent had a positive response to placebo acupuncture, 58 percent had a positive response to placebo surgery.
"2014... 53 randomized controlled trials that included placebo surgery as one option. In more than half of them ... the effect of sham surgery was equivalent to that of the actual procedure."

If you are getting surgery done, do your own research on it and ask questions!

 

-------- Original Message --------

Subject: “The Placebo Effect Doesn’t Apply Just to Pills” plus 1 more
Date: Thu, 9 Oct 2014 11:13:06 +0000
From: The Incidental Economist <tie@theincidentaleconomist.com>
To: <ellisrp@bu.edu>



The Placebo Effect Doesn’t Apply Just to Pills

Posted: 09 Oct 2014 04:00 AM PDT

The following originally appeared on The Upshot (copyright 2014, The New York Times Company).

For a drug to be approved by the Food and Drug Administration, it must prove itself better than a placebo, or fake drug. This is because of the “placebo effect,” in which patients often improve just because they think they are being treated with something. If we can’t compare a new drug with a placebo, we can’t be sure that the benefit seen from it is anything more than wishful thinking.

But when it comes to medical devices and surgery, the requirements aren’t the same. Placebos aren’t required. That is probably a mistake.

At the turn of this century, arthroscopic surgery for osteoarthritis of the knee was common. Basically, surgeons would clean out the knee using arthroscopic devices. Another common procedure was lavage, in which a needle would inject saline into the knee to irrigate it. The thought was that these procedures would remove fragments of cartilage and calcium phosphate crystals that were causing inflammation. A number of studies had shown that people who had these procedures improved more than people who did not.

However, a growing number of people were concerned that this was really no more than a placebo effect. And in 2002, a study was published that proved it.

A total of 180 patients who had osteoarthritis of the knee were randomly assigned (with their consent) to one of three groups. The first had a standard arthroscopic procedure, and the second had lavage. The third, however, had sham surgery. They had an incision, and a procedure was faked so that they didn’t know that they actually had nothing done. Then the incision was closed.

The results were stunning. Those who had the actual procedures did no better than those who had the sham surgery. They all improved the same amount. The results were all in people’s heads.

Many who heard about the results were angry that this study occurred. They thought it was unethical that people received an incision, and most likely a scar, for no benefit. But, of course, the same was actually true for people who had arthroscopy or lavage: They received no benefit either. Moreover, the results did not make the procedure scarce. Years later, more than a half-million Americans still underwent arthroscopic surgery for osteoarthritis of the knee. They or their insurers spent about $3 billion that year on a procedure that was no better than a placebo.

Sham procedures for research aren’t new. As far back as 1959, the medical literature was reporting on small studies that showed that procedures like internal mammary artery ligation, a surgical procedure used to treat angina, were no better than a fake incision.

In 2005, a study was published in the Journal of the American College of Cardiology proving that percutaneous laser myocardial revascularization, in which a laser is threaded through blood vessels to cut tiny channels in the heart muscle, didn’t improve angina better than a placebo either. We continue to work backward and use placebo-controlled research to try to persuade people not to do procedures, rather than use it to prove conclusively that they work in the first place.

A study published in 2003, without a sham placebo control, showed that vertebroplasty — treating back pain by injecting bone cement into fractured vertebrae — worked better than no procedure at all. From 2001 through 2005, the number of Medicare beneficiaries who underwent vertebroplasty each year almost doubled, from 45 to 87 per 100,000. Some of them had the procedure performed more than once because they failed to achieve relief. In 2009, not one but two placebo-controlled studies were published proving that vertebroplasty for osteoporotic vertebral fractures worked no better than faking the procedure.

Over time, after the 2002 study showing that arthroscopic surgery didn’t work for osteoarthritis of the knee, the number of arthroscopic procedures performed for this condition did begin to go down. But at the same time, the number of arthroscopic procedures for tears of the meniscus cartilage in the knee began to go up fast. Soon, about 700,000 of them were being performed each year, with direct costs of about $4 billion. Less than a year ago, many were shocked when arthroscopic surgery for meniscal tears performed no better than sham surgery. This procedure was the most common orthopedic procedure performed in the United States.

The ethical issues aren’t easily dismissed. Theoretically, a sugar pill carries no risk, and a sham procedure does. This is especially true if the procedure requires anesthesia. The surgeon must go out of his or her way to fool the patient. Many would have difficulty doing that.

But we continue to ignore the real potential that many of our surgical procedures and medical devices aren’t doing much good — and might even be doing harm, since real surgery has been shown to pose more risks than sham surgery.

Rita Redberg, in a recent New England Journal of Medicine Perspectives article on sham controls in medical device trials, noted that in a recent systematic review of migraine prophylaxis, while 22 percent of patients had a positive response to placebo medications and 38 percent had a positive response to placebo acupuncture, 58 percent had a positive response to placebo surgery. The placebo effect of procedures is not to be ignored.

Earlier this year, researchers published a systematic review of placebo controls in surgery. They searched the medical literature from its inception all the way through 2013. In all that time, they could find only 53 randomized controlled trials that included placebo surgery as one option. In more than half of them, though, the effect of sham surgery was equivalent to that of the actual procedure. The authors noted, though, that with the exception of the studies on osteoarthritis of the knee and internal mammary artery ligation noted above, “most of the trials did not result in a major change in practice.”

We have known about the dangers of ignoring the need for placebo controls in research on surgical procedures for some time. When the few studies that are performed are published, we ignore the results and their implications. Too often, this is costing us many, many billions of dollars a year, and potentially harming patients, for no apparent gain.

@aaronecarroll


Placebo history

Posted: 09 Oct 2014 03:00 AM PDT

Here are my highlights from “Placebos and placebo effects in medicine: historical overview,” by Anton de Craen and colleagues. All are direct quotes.

  • In 1807 Thomas Jefferson, recording what he called the pious fraud, observed that ‘one of the most successful physicians I have ever known has assured me that he used more bread pills, drops of colored water, and powders of hickory ashes, than of all other medicines put together’. About a hundred years later, Richard Cabot, of Harvard Medical School, described how he ‘was brought up, as I suppose every physician is, to use placebo, bread pills, water subcutaneously, and other devices’.
  • The word placebo (Latin, ‘I shall please’) was first used in the 14th century. In that period, it referred to hired mourners at funerals. These individuals often began their wailings with Placebo Domino in regione vivorum, the ninth verse of psalm cxiv, which in the Latin Vulgate translation means ‘I shall please the Lord in the land of the living’. Here, the word placebo carries the connotation of depreciation and substitution, because professional mourners were often stand-ins for members of the family of the deceased.
  • In 1801, John Haygarth reported the results of what may have been the first placebo-controlled trial. A common remedy for many diseases at that time was to apply metallic rods, known as Perkins tractors, to the body. These rods were supposed to relieve symptoms through the electromagnetic influence of the metal. Haygarth treated five patients with imitation tractors made of wood and found that four gained relief. He used the metal tractors on the same five patients the following day and obtained identical results: four of five subjects reported relief.
  • In the 1785 New Medical Dictionary, placebo is described as ‘a commonplace method or medicine’. In 1811, the revised Quincy’s Lexicon-Medicum defines placebo as ‘an epithet given to any medicine adapted more to please than to benefit the patient’.
  • In the 1930s, several important papers were published with regard to the introduction of placebos in clinical research. [... Two] papers assessed the value of drugs used in the treatment of angina pectoris in cross-over experiments and deceptively administered placebos to the ‘no-treatment’ comparison group. [...] In both trials the drugs were judged to exert no specific action that might be useful in the treatment of angina. Gold and colleagues tried to explain why inert interventions might work: their points included ‘confidence aroused in a treatment’, the ‘encouragement afforded by a new procedure’ and ‘a change of medical advisor’.
  • Placebo was a fraud and deception that had the ‘moral effect of a remedy given specially for the disease’, but placebos did not affect the natural course of disease; they were a priori excluded from having such an impact. Placebos were therapeutic duds to manage patients, or, as in the Flint investigation, a camouflage behind which to watch nature take its course.
  • In 1938, the word placebo was first applied in reference to the treatment given to concurrent controls in a trial.
  • The efficacy of cold vaccines was evaluated in several placebo-controlled trials. [...] The conclusion [of one] reads ‘one of the most significant aspects of this study is the great reduction in the number of colds which the members of the control groups reported during the experimental period. In fact these results were as good as many of those reported in uncontrolled studies which recommended the use of cold vaccines’. The placebo effect was born.

@afrakt


A model for US: $1 coins and no pennies

I just returned from a vacation in Ecuador (which is spectacular) but wanted to post about a wonderful feature of their monetary system.

Ecuador does not have its own currency but instead uses the US dollar as its only currency. US dollar bills and coins are used everywhere, which is very convenient for visitors. But they do two intelligent things.

* They do not use paper $1 bills, but instead rely almost solely on the US-minted Sacagawea dollar coins for transactions.

* They generally do not use pennies but instead round transactions to the nearest nickel.

(They do mint their own Ecuadorian US-size dimes, quarters and nickels to make up for their shortage, using the same front side but a different reverse. They must have imported millions of $1 coins.)
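As a concrete illustration of the nickel rounding rule, SAS's round() function with a rounding unit of 0.05 does exactly this. A minimal sketch of my own (the bill total is made up):

data nickel;
  total = 4.97;                  * a hypothetical bill total;
  rounded = round(total, 0.05);  * nearest nickel: 4.97 -> 4.95;
  put total= rounded=;           * 4.98 would round up to 5.00;
run;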

Wouldn't it be nice if the US adopted this system!

 

NEJM: Sham Controls in Medical Device Trials

Rita F. Redberg, M.D.

N Engl J Med 2014; 371:892-893. September 4, 2014. DOI: 10.1056/NEJMp1406388

(Bold emphasis added by RPE)

The problem:

Only 1% of all medical devices reach the market through the premarket-approval route — the only pathway that requires the submission of clinical data. Research has shown that premarket approvals are often based on data from one small trial that used surrogate end points and included only short-term follow-up.1

RCTs are rarely used:

“Blinded, randomized, controlled trials (RCTs), in which the proposed therapy is compared with a placebo or a “sham” (nontherapeutic) intervention, are common for drugs but rare for medical devices.”

Even complex RCTs with invasive procedures are possible.

“…double-blind trials of fetal-tissue transplantation for Parkinson's disease, discussed by Freeman et al. (1999). The sham procedure involved making twist-drill holes in the patient's forehead and was considered necessary and ethical for determining whether there was an effect of treatment beyond the placebo effect (there was not).”

“Another important lesson on the value of sham controls came from vertebroplasty, a procedure in which bone cement is injected into a fractured vertebra for treatment of a compression fracture. Vertebroplasty became popular in the early 2000s, on the basis of observational studies and a nonrandomized trial. Fueled by position statements from various U.S. radiologic and neurologic surgical societies arguing the benefits of these procedures, the number of vertebroplasties performed in Medicare patients nearly doubled between 2001 and 2005, increasing from 45.0 to 86.8 per 100,000 enrollees.3 In 2009, however, RCTs that included a group assigned to receive a nontherapeutic procedure found that pain relief in the sham-procedure group was no different from that in the group that received the actual procedure.4”

Placebo effects are even larger with procedures than with drugs.

“Researchers at the Institute of Medical Psychology in Munich recently quantified that power for various types of placebo treatments in studies of migraine prophylaxis. They found that 58% of patients had a positive response to sham surgery and 38% had a positive response to sham acupuncture, while only 22% had a positive response to oral pharmacologic placebos.5”

Conclusion: More RCTs are needed. But the article does not address the problem that even with RCTs it is hard to change physician practice.

Full article is here.
http://www.nejm.org/doi/full/10.1056/NEJMp1406388?query=TOC

Employer Sponsored Insurance Also Surged in MA in 2007.

There has been a great deal of surprise expressed in the media over RAND’s latest report suggesting that more people have become insured through employer sponsored insurance (ESI) than through either Medicaid or the Exchanges under the ACA. One example is Adrianna McIntyre on The Incidental Economist, who posted on Wednesday:

"I can’t overstate how stunning this finding is if it’s true; CBO expected that ESI gains and losses would pretty much break even in 2014 and that employer coverage would decline modestly in future years (p. 108)."

This result is precisely NOT stunning if you study the Massachusetts health reform. In Massachusetts, the expansion in ESI coverage ALSO led the total increase during the first year and a half. Below is a table summarizing the early returns in MA, from a Massachusetts Division of Health Care Finance and Policy study in 2011.

http://www.mass.gov/chia/docs/r/pubs/11/2011-key-indicators-may.pdf

Notice how growth in ESI dominated both Medicaid and the Exchange in the first two years, before being surpassed by these other two.

I speculate that part of the reason so many Massachusetts employers dropped their plans in 2010 was that they knew the plans were not compliant with the ACA's new higher standards, but that is speculation. There was also a serious recession that affected employment and enrollment.

Massachusetts Health Reform: Insured Population by Insurance Type, 2006-2010 (excluding Medicare)
Source: http://www.mass.gov/chia/docs/r/pubs/11/2011-key-indicators-may.pdf

Insured population by insurance type
                     Jun 30 2006  Dec 31 2006  Dec 31 2007  Dec 31 2008  Dec 31 2009  Dec 31 2010
Private Group          4,333,014    4,395,136    4,457,157    4,474,466    4,358,867    4,315,040
Individual Purchase       40,184       38,718       65,465       81,073      114,668      117,514
MassHealth               705,179      740,663      764,559      780,727      848,528      898,572
Commonwealth Care              0       18,327      158,194      162,725      150,998      158,973
Total Members          5,078,377    5,192,814    5,445,375    5,498,991    5,473,061    5,490,099

Change since 6/30/2006
Private Group                          62,122      124,143      141,452       25,853      -17,974
Individual Purchase                    -1,466       25,281       40,889       74,484       77,330
MassHealth                             35,484       59,380       75,548      143,349      193,393
Commonwealth Care                      18,327      158,194      162,725      150,998      158,973
Total Members                         114,437      366,998      420,614      394,684      411,722

Distribution of new enrollment as a fraction of total gains
Private Group                             54%          34%          34%           7%          -4%
Individual Purchase                       -1%           7%          10%          19%          19%
MassHealth                                31%          16%          18%          36%          47%
Commonwealth Care                         16%          43%          39%          38%          39%
Total Members                            100%         100%         100%         100%         100%
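Since the changes and shares in the second and third panels are easy to recompute from the counts, here is a minimal SAS sketch of my own (dataset and variable names are made up; component sums may differ slightly from the published totals because of small reporting discrepancies in the source):

data ma_gains;
  input date :$11. private indiv masshealth commcare;
  * subtract the June 30, 2006 baseline counts from the table above;
  chg_private = private    - 4333014;
  chg_indiv   = indiv      - 40184;
  chg_mass    = masshealth - 705179;
  chg_comm    = commcare   - 0;
  total_gain  = chg_private + chg_indiv + chg_mass + chg_comm;
  * each coverage source's share of the total gain;
  share_private = chg_private / total_gain;  * about -4% by Dec 2010;
  share_comm    = chg_comm    / total_gain;  * about 39% by Dec 2010;
  datalines;
31dec2006 4395136 38718 740663 18327
31dec2010 4315040 117514 898572 158973
;
run;

proc print data=ma_gains; run;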

 

Explaining these two graphs should merit a Nobel prize

Reposting from The Incidental Economist Blog

What happened to US life expectancy?

Posted: 07 Jan 2014 03:00 AM PST

Here’s another chart from the JAMA study “The Anatomy of Health Care in the United States”:

[Chart: life expectancy at birth, US vs. OECD median]

Why did the US fall behind the OECD median in the mid-1980s for men and the early 1990s for women? Note, the answer need not point to the health system. But, if it does, it’s not the first chart to show things going awry with it around that time. Before I quote the authors’ answer, here’s a related chart from the paper:

[Chart: years of potential life lost (YPLL), US as a multiple of the OECD median]

The chart shows years of potential life lost in the US as a multiple of the OECD median, over time. Values greater than 1 are bad (for the US). There are plenty of those. A value of exactly 1 would mean the US is at the OECD median. Below 1 would indicate we're doing better. There are not many of those.

It’d be somewhat comforting if the US at least showed improvement over time. But, by and large, it does not. For many conditions, you can see the US pulling away from the OECD countries beginning in/around 1980 or 1990, as was the case for life expectancy shown above. Why?

The authors’ answer:

Possible causes of this departure from international norms were highlighted in a 2013 Institute of Medicine report and have been ascribed to many factors, only some of which are attributed to medical care financing or delivery. These include differences in cultural norms that affect healthy behaviors (gun ownership, unprotected sex, drug use, seat belts), obesity, and risk of trauma. Others are directly or indirectly attributable to differences in care, such as delays in treatment due to lack of insurance and fragmentation of care between different physicians and hospitals. Some have also suggested that unfavorable US performance is explained by higher risk of iatrogenic disease, drug toxicity, hospital-acquired infection, and a cultural preference to “do more,” with a bias toward new technology, for which risks are understated and benefits are unknown. However, the breadth and consistency of the US underperformance across disease categories suggests that the United States pays a penalty for its extreme fragmentation, financial incentives that favor procedures over comprehensive longitudinal care, and absence of organizational strategy at the individual system level. [Link added.]

This is deeply unsatisfying, though it may be the best explanation available. Nevertheless, the sentence in bold is purely speculative. One must admit that it is plausible that fragmentation, incentives for procedures, and lack of organizational strategy could play a role in poor health outcomes in the US — they certainly don’t help — but the authors have also ticked off other factors. Which, if any, dominate? It’s completely unclear.

Apart from the explanation or lack thereof, I also wonder how much welfare has been lost relative to the counterfactual that the US kept pace with the OECD in life expectancy and health spending. It’s got to be enormous unless there are offsetting gains in areas of life other than longevity and physical well-being. For example, if lifestyle is a major contributing factor, perhaps doing and eating what we want (to the extent we’re making choices) is more valuable than lower mortality and morbidity. (I doubt it, but that’s my speculation/opinion.)

(I’ve raised some questions in this post. Feel free to email me with answers, if you have any.)

@afrakt

Personal experience with the new Federal Exchange web site

Randall P. Ellis, Professor, Boston University Department of Economics and past president of the American Society of Health Economists.

Today, Tuesday Dec 3, I went online to check out the new HealthCare.gov web site for selecting individual health insurance. I checked out options for enrolling in Oxford County, Maine. The web page now has a totally new feel and look to it. Most importantly, it allowed me to shop for different plan options without having to first pass through the extensive security barriers that used to prevent people from shopping until they established eligibility. Now it is attractive and better than the Massachusetts exchange.

I clicked through 50 screens and dozens of plans in the middle of Tuesday morning with no noticeable delays or glitches. (The Boston University benefits web site gave me more problems in recent weeks.)

The options look terrific to me, although I am covered at work and hence not eligible to enroll through the exchanges.

The premium in rural Oxford County, Maine, for a 20-year-old in the lowest cost option is only $110 per month, without any government subsidy. That is astoundingly low compared to the overpriced policies that were previously available.

I also priced out a gold plan (Community Advantage) comparable to my coverage at Boston University for a family of three. Without any subsidy, that plan would be $1799 per month. The Anthem Blue Cross Blue Shield Gold Guided Access plan was $2013/month. At BU I am currently paying $1813 per month. So these two plans look reasonable to me in comparison. Of course my employer subsidizes my coverage, and many will be eligible for subsidies from the ACA or their employer.

Also new is the link to the Kaiser Family Foundation calculator, which allows the user to get an estimate of any savings that he or she is eligible for based on income and family size. I played with it for a while, and it worked well. I quickly used it to calculate that a 39-year-old in Oxford County, Maine, earning $30,000 per year could expect to pay $3,790 per year in premiums and then receive a tax credit of $1,278, bringing the net cost down to $2,512 per year, which is 8.37% of income.
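For readers who want to verify the arithmetic, here is the same calculation as a short SAS data step (a sketch of my own, using only the numbers reported above):

data kff_check;
  income  = 30000;                          * annual income;
  premium = 3790;                           * unsubsidized annual premium;
  credit  = 1278;                           * estimated premium tax credit;
  net_cost = premium - credit;              * = 2512;
  pct_of_income = 100 * net_cost / income;  * about 8.37 percent of income;
  put net_cost= pct_of_income=;
run;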

This new interface makes shopping on the exchanges simple and easy to understand.

Although terribly unpleasant, the flaws in the initial Healthcare.gov system promoted awareness and discussion in the media about the new exchanges, which is good. It also encouraged employers to step forward and offer coverage instead of relying on individuals. Both of these are very positive outcomes.

I predict that enrollments through the exchanges by the end of December will be below the initial, optimistic forecasts of the administration, but that millions more will enroll in early 2014 as people fill out their tax forms and are prompted to answer whether they have health insurance. In Massachusetts, that was a greater motivation to purchase than the end of the calendar year.

Playing video games does not predict violent behavior in children

(Reposted from The Incidental Economist) This November 2013 UK study confirms what other studies have shown, which is that playing video games does not predict psychosocial adjustment problems in young children. Even watching 3 hours of TV per day in the UK has no meaningful association.

I also reposted my favorite graph about videos and gun violence from an earlier TIE posting.

Perhaps the 50th anniversary of JFK's death, committed with a $20 mail-order rifle, is yet another good time to refocus on gun control.

Happy Thanksgiving!

Randy

The dangers of TV and video games
Posted: 25 Nov 2013 06:01 AM PST
From Archives of Disease in Childhood, “Do television and electronic games predict children’s psychosocial adjustment? Longitudinal research using the UK Millennium Cohort Study”:

BACKGROUND: Screen entertainment for young children has been associated with several aspects of psychosocial adjustment. Most research is from North America and focuses on television. Few longitudinal studies have compared the effects of TV and electronic games, or have investigated gender differences.

PURPOSE: To explore how time watching TV and playing electronic games at age 5 years each predicts change in psychosocial adjustment in a representative sample of 7 year-olds from the UK.

METHODS: Typical daily hours viewing television and playing electronic games at age 5 years were reported by mothers of 11 014 children from the UK Millennium Cohort Study. Conduct problems, emotional symptoms, peer relationship problems, hyperactivity/inattention and prosocial behaviour were reported by mothers using the Strengths and Difficulties Questionnaire. Change in adjustment from age 5 years to 7 years was regressed on screen exposures; adjusting for family characteristics and functioning, and child characteristics.

RESULTS: Watching TV for 3 h or more at 5 years predicted a 0.13 point increase (95% CI 0.03 to 0.24) in conduct problems by 7 years, compared with watching for under an hour, but playing electronic games was not associated with conduct problems. No associations were found between either type of screen time and emotional symptoms, hyperactivity/inattention, peer relationship problems or prosocial behaviour. There was no evidence of gender differences in the effect of screen time.

CONCLUSIONS: TV but not electronic games predicted a small increase in conduct problems. Screen time did not predict other aspects of psychosocial adjustment. Further work is required to establish causal mechanisms.

Since we’re never going to have an RCT of TV or video games, these kinds of prospective cohort studies are important. In this one, they followed more than 11,000 children in the UK. They found that watching TV for three hours or more (a day!) at 5 years was associated with a higher chance of having a conduct disorder at 7 years versus kids who watched less than an hour a day. How much of a difference? A 0.13 point increase in conduct problems. That corresponds, according to the article, to “0.09 of a SD [standard deviation] increase in age 7 years conduct score.” Do you understand now? I don’t either. Anyway, the authors said it was a “small increase in conduct problems”. Video games? No effect. Yes, these are young kids, and it’s unlikely that they have been playing much GTA 5 or Battlefield 4. So I’ll look forward to more data. But at this point, it’s hard to point to a large study like this and find a smoking gun. Figuratively or literally. More on this topic here and here.

@aaronecarroll
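One way to unpack that standardized effect (my note, not TIE's): if 0.13 raw points equals 0.09 of a standard deviation, the implied SD of the age-7 conduct score is 0.13/0.09, or about 1.4 points, so heavy TV viewing shifts a child by roughly a tenth of a standard deviation. A two-line SAS check:

data effect_size;
  raw_effect  = 0.13;                      * reported increase in conduct score;
  sd_fraction = 0.09;                      * the same effect in SD units;
  implied_sd  = raw_effect / sd_fraction;  * about 1.44 points;
  put implied_sd=;
run;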

This is my favorite graph on this topic. From here

http://theincidentaleconomist.com/wordpress/wp-content/uploads/2012/12/video-game-chart-no-trendline.jpg

Two great reposts from TIE/JAMA

This repost from The Incidental Economist (TIE) is one of the best summaries of US health care I have seen. I also appended the Uwe Reinhardt posting at the bottom.

(The JAMA authors are Hamilton Moses III, MD; David H. M. Matheson, MBA, JD; E. Ray Dorsey, MD, MBA; Benjamin P. George, MPH; David Sadoff, BA; and Satoshi Yoshimura, PhD.)

The JAMA article, which has an abundance of tables, references, and graphs, will be on my MA and Ph.D. reading lists.

Anyone interested in keeping up with current US health policy from an economist's point of view should subscribe to TIE, although it can be distracting, frustrating, and time consuming.

Randy

Study: The Anatomy of Health Care in the United States

Posted: 13 Nov 2013 03:55 AM PST

From JAMA. I reformatted the abstract, and broke it up into paragraphs to make it easier to read:

Health care in the United States includes a vast array of complex interrelationships among those who receive, provide, and finance care. In this article, publicly available data were used to identify trends in health care, principally from 1980 to 2011, in the source and use of funds (“economic anatomy”), the people receiving and organizations providing care, and the resulting value created and health outcomes.

In 2011, US health care employed 15.7% of the workforce, with expenditures of $2.7 trillion, doubling since 1980 as a percentage of US gross domestic product (GDP) to 17.9%. Yearly growth has decreased since 1970, especially since 2002, but, at 3% per year, exceeds any other industry and GDP overall.

Government funding increased from 31.1% in 1980 to 42.3% in 2011. Despite the increases in resources devoted to health care, multiple health metrics, including life expectancy at birth and survival with many diseases, show the United States trailing peer nations. The findings from this analysis contradict several common assumptions. Since 2000,

  1. price (especially of hospital charges [+4.2%/y], professional services [3.6%/y], drugs and devices [+4.0%/y], and administrative costs [+5.6%/y]), not demand for services or aging of the population, produced 91% of cost increases;
  2. personal out-of-pocket spending on insurance premiums and co-payments has declined from 23% to 11%; and
  3. chronic illnesses account for 84% of costs overall among the entire population, not only of the elderly.

Three factors have produced the most change:

  1. consolidation, with fewer general hospitals and more single-specialty hospitals and physician groups, producing financial concentration in health systems, insurers, pharmacies, and benefit managers;
  2. information technology, in which investment has occurred but value is elusive; and
  3. the patient as consumer, whereby influence is sought outside traditional channels, using social media, informal networks, new public sources of information, and self-management software.

These forces create tension among patient aims for choice, personal care, and attention; physician aims for professionalism and autonomy; and public and private payer aims for aggregate economic value across large populations. Measurements of cost and outcome (applied to groups) are supplanting individuals’ preferences. Clinicians increasingly are expected to substitute social and economic goals for the needs of a single patient. These contradictory forces are difficult to reconcile, creating risk of growing instability and political tensions. A national conversation, guided by the best data and information, aimed at explicit understanding of choices, tradeoffs, and expectations, using broader definitions of health and value, is needed.

My frustration? That anyone treats any of this as news. At some point we need to stop diagnosing the problem and start doing something about it.

The whole thing is worth a read. But none of it will be news for regular visitors to TIE. Why isn’t everyone reading this blog already?!?!?!

@aaronecarroll

Quote: Uwe (Need I say more?)

Posted: 13 Nov 2013 04:00 AM PST

[T]he often advanced idea that American patients should have “more skin in the game” through higher cost sharing, inducing them to shop around for cost-effective health care, so far has been about as sensible as blindfolding shoppers entering a department store in the hope that inside they can and will then shop smartly for the merchandise they seek. So far the application of this idea in practice has been as silly as it has been cruel. [...]

In their almost united opposition to government, US physicians and health care organizations have always paid lip service to the virtue of market, possibly without fully understanding what market actually means outside a safe fortress that keeps prices and quality of services opaque from potential buyers. Reference pricing for health care coupled with full transparency of those prices is one manifestation of raw market forces at work.

-Uwe Reinhardt, The Journal of the American Medical Association. I thank Karan Chhabra for the prod.

@afrakt

AHRF/ARF 2012-13 data is available free

AHRF = Area Health Resources Files (formerly ARF)

The 2012-2013 AHRF data files and documentation can now be downloaded at no cost. Click the link below to learn how to download the documentation and data.

http://arf.hrsa.gov/

“The Area Health Resources Files (AHRF)—a family of health data resource products—draw from an extensive county-level database assembled annually from over 50 sources. The AHRF products include county and state ASCII files, an MS Access database, an AHRF Mapping Tool and Health Resources Comparison Tools (HRCT). These products are made available at no cost by HRSA/BHPR/NCHWA to inform health resources planning, analysis and decision making.”

"The new AHRF Mapping Tool enables users to compare the availability of healthcare providers as well as environmental factors impacting health at the county and state levels."

I thank Saikat Kundu for bringing the following NYT article to my attention. I pasted two short excerpts below.


July 31, 2013, 10:20 am

Revealing a Health Care Secret: The Price

By TINA ROSENBERG

"The Surgery Center of Oklahoma is an ambulatory surgical center in Oklahoma City owned by its roughly 40 surgeons and anesthesiologists. What makes it different from every other such facility in America is this: If you need an anterior cruciate ligament reconstruction, you will know beforehand — because it’s on their Web site — that it costs $6,990 if you self-pay in advance. If you need a tonsillectomy, that’s $3,600. Repair of a simple closed nasal fracture: $1,900. These prices are all-inclusive."
...

 

"Why are health care costs so high? It’s not because of quality; numerous studies have failed to find any correlation between price and quality. Nor is price a function of hospital costs — not when one facility in Oklahoma City can charge 7.5 times what another charges for the same procedure.

One of the most important reasons has to do with the political and market power of health care providers, who are essentially able to name their charges. The foundation of that system is the fact that only sellers, and not buyers, know the price. If prices are secret, patients can’t comparison shop. There is no way to push prices down, or force providers to compete on price. Price secrecy hides the need for reform. “Getting prices out in the open is crucial to bringing prices down,” said Katherine Hempstead, senior program officer at the Robert Wood Johnson Foundation."

http://opinionator.blogs.nytimes.com/2013/07/31/a-new-health-care-approach-dont-hide-the-price/?_r=0