Saturday, July 29, 2017

J11 - Economics and Time (Part I)


My first Self-Designed Assignment is complete, but it’s not everything I wanted it to be. I stand by what I wrote, but I do think that something is missing. The definition of economics, as it stood, seemed coherent enough to stand on its own, so I wasn’t sure how to integrate the time element into it, and the deadline I imposed on myself left no extra time to find a way to relate economics back to time. I’ve decided to rectify that problem by elaborating on the economics of time here, in my Journals.

Economics is really all about change. Acting man peers into the future and tries to ascertain what conditions of existence he will soon face. But these future conditions are ones that will come about through the natural operation of the world, without action. Finding these conditions to be unsatisfactory, man acts now, in the present, interfering with the natural flow of events in order to change the approaching future conditions and render them more satisfactory for himself. In this way, all action is future-oriented. We cannot change the present. Only the future, whether that future is so near that its arrival seems instantaneous or so far that the actor will never live to see it. Action is man’s effort to become better off, but to become implies a change in conditions, and change can only occur through time. Economics, then, may also be defined as the study of man’s attempts to change the future. 

Time is also the fundamental essence of capital. Capital is a product of time: as a produced means of production, it required time to bring into existence. However, capital is more accurately an advancement in time. This may seem counterintuitive, since capital is created not for immediate consumption, but to aid in the production of consumer goods. It seems, therefore, that the production of capital would lengthen the production process and increase the time that passes between the moment the actor sets about attaining his end and the moment of attainment. This is half true: investment in capital does lengthen the production process, but not the amount of time that passes between the start of production and the attainment of the final product. This is because capital is an advancement in time.

For example, suppose that Robinson Crusoe is stranded on a deserted island. The only food source on the island is berries. Every day, Crusoe gathers what berries he can pick and consumes them. Suppose that Crusoe can pick 100 berries in an hour, and that he spends 10 hours a day picking berries, ending each day with 1,000 berries. One day, however, Crusoe forgoes the consumption of berries and spends his 10 working hours fashioning a stick with which he can shake berries out of the bushes and a basket with which he can collect the fallen berries. The next day, armed with his tools, Crusoe manages to gather 300 berries an hour and therefore ends his day with 3,000 berries. The production process was lengthened because Crusoe did not immediately start producing berries. He produced a stick and a basket, capital goods, before producing the berries, consumer goods. However, without the capital goods, Crusoe would have needed 30 hours to collect 3,000 berries. Even though the capital goods took a full working day to create, they still saved him 10 hours of work: 20 hours in total (10 building, 10 gathering) instead of 30.
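To make the arithmetic explicit, here is a minimal sketch in Python, with the numbers taken straight from the example above:

```python
# Crusoe's berry arithmetic, with and without capital goods.
BERRIES_PER_HOUR_BARE = 100    # picking by hand
BERRIES_PER_HOUR_TOOLS = 300   # with the stick and basket
TOOL_BUILD_HOURS = 10          # one full working day spent making capital goods
TARGET_BERRIES = 3_000

# Without capital: gather everything by hand.
hours_without_capital = TARGET_BERRIES / BERRIES_PER_HOUR_BARE                   # 30 hours

# With capital: build the tools first, then gather faster.
hours_with_capital = TOOL_BUILD_HOURS + TARGET_BERRIES / BERRIES_PER_HOUR_TOOLS  # 20 hours

print(f"Without capital: {hours_without_capital:.0f} hours")
print(f"With capital:    {hours_with_capital:.0f} hours")
print(f"Time saved:      {hours_without_capital - hours_with_capital:.0f} hours")
```

Even counting the day spent on tool-making, the roundabout path reaches the same 3,000 berries 10 hours sooner.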

Capital brings the actor closer to the attainment of his end. That is its value, that is its advantage. The production process is lengthened by the accumulation of capital, but that capital provides the shortest, most direct path to the attainment of the end sought. The difference between a man with capital and a man without is that the man with capital will attain his ends sooner. Capital is time.

Capitalism, then, is about time. The free economy becomes the capitalist economy when people begin to orient their actions toward the future, investing in intermediary goods (capital) rather than always seeking direct consumption. However, the very existence of a market presupposes an orientation toward the future, since in a market economy people produce a surplus of goods with the expectation that this surplus can be traded for the surplus goods of others at some future time. Therefore, a market economy is a capitalist economy. The existence of trade indicates a future orientation and a level of capital accumulation that allow even such a microsystem to be called capitalist. Capitalism, after all, is just the economizing actions of many individuals.

But this means that capitalism is about change. About changing the future. About making the future a better place. And, indeed, this is the purpose of all action. Again, we see that capitalism is nothing more than the efforts of free men and women trying to improve their lives, or, more accurately, to improve their futures. Economics, as the study of man in his efforts to improve the future, is the study of capitalism and the factors that influence its functioning.

Sunday, July 9, 2017

J10 - The Place of Economics

*This journal is dedicated to Febronia Mansour, who, in the course of setting up an interview with me, has forced me to do much thinking on many topics.*


I knew, at the outset of this project, that I would be hard-pressed to dedicate the time and energy to it that I wanted to. My experience of the past few weeks has confirmed that prediction. I leave my house before 7:30am, I get home after 10:00pm, and then I run for 1-4 hours. Every day. This leaves very little time for my project. I have been working on my first Self-Designed Assessment, which will be a paper that attempts to redefine some of the foundational concepts in economics so as to gain greater insights into their meanings. However, this “work” has proceeded one paragraph at a time, usually slipped in at the end of a very long day. Therefore, although a first draft has been completed, I need to find a long span of free time when I can sit down and fit together all the disjointed paragraphs into a more coherent whole. While I’m waiting for such a block of time to appear, I thought I’d use this journal to briefly discuss the science of economics itself.


In the past, I’ve defined economics as the study of human action (purposeful behavior) under conditions of scarcity. This definition was meant to contrast with the mainstream definitions of economics, which usually have something to do with wealth or material conditions, and thereby to highlight the broader scope of properly-understood economics. However, after some recent thinking on the subject, I’ve only just come to realize how broad this subject truly is. Economics is the study of what makes us human, as opposed to stones or lower life forms.

Every science is a search for knowledge, but, more specifically, every science is a search for knowledge that can serve man. As I explained in my post on the 5Cs, man seeks knowledge for a specific purpose, so that he can accomplish some task. Science is an attempt to trace every phenomenon back to its cause, to provide an understanding of cause and effect, so that man can bring about his desired changes in the world. 

Now, man’s freedom to act is constrained by three sets of laws: physical, physiological, and praxeological. The physical sciences seek to understand and elucidate how the world works. The life sciences seek to understand and elucidate the functions of life. And the human sciences seek to understand and elucidate the distinctive characteristic of human activity, i.e., the purpose behind it. 

The sciences are discovering rules that apply to us, as they must if they are to properly serve us. The laws of physics apply to us because we are matter and energy, and are useful to the extent that we seek to shape matter and energy. The laws of biology apply to us because we are alive, and are useful to the extent that we seek to improve and preserve life. And the laws of economics apply to us because we are human, and are useful to the extent that we seek to interact with and influence our fellow man. 

Economics, then, is far more than some form of business theory, or a mere means of explaining the wealth of nations, as many people believe. Economics is a principal science, the study of man as a human being. The knowledge it provides is applicable to all fields of human activity. Moreover, because the purpose of all science is to serve man, at least a basic understanding of man must precede any scientific venture. Therefore, not only does economic knowledge apply to every field of human activity; every field of human activity requires economic knowledge. 

“[Economics] is the philosophy of human life and action and concerns everybody and everything. It is the pith of civilization and of man’s human existence.” -Ludwig von Mises, Human Action


Saturday, July 8, 2017

A Note on FDR

I, of all people, know that public schools often function as indoctrination camps. However, it still took me by surprise when, on June 21, the EMC2 students named Franklin Roosevelt as “the greatest president.” There was no right answer to the question I posed to the students, but FDR was definitely a wrong one.

Of course, the students knew that FDR had ordered the internment of Japanese Americans during the Second World War, and faulted him for that. But they still chose him as the greatest president because he “ended the Great Depression” and “led us through WWII.” But such a simple analysis ignores many other crimes of FDR, and, quite frankly, attributes to FDR achievements that were not his. 

FDR did not end the Great Depression. In fact, he prolonged and worsened it. Yes, he certainly “took action” in trying to relieve the economic ills faced by most Americans during the 30s, but these actions were misguided. Roosevelt spent more during his presidency than all previous presidents combined, and he instituted hundreds of new programs and regulatory schemes that were ostensibly designed to end the downturn, all to no avail. In 1938, after five years of the New Deal, the unemployment rate still stood near 19 percent, higher than it had been in 1931, two years into the Depression. It wasn’t until WWII that the unemployment rate returned to semi-normal levels. This was helped, no doubt, by the fact that millions of American men were drafted into the war. But some of Roosevelt’s schemes were just ghastly: the AAA (Agricultural Adjustment Administration), for example, operating on the assumption that prices were too low and had to be raised by any means, set out to reduce the supply of crops and livestock coming from the farms. Yes, at a time when people were starving and wearing out their hand-me-downs, FDR drastically increased the cost of food and clothing. This also exacerbated the unemployment problem: some two million farmhands lost their jobs because the government was paying farmers not to grow anything.

Moreover, the massive amount of money that Roosevelt spent on public works projects, which is something everyone seems to love, was financed ultimately by tax dollars. People cheer when the government spends money, but all too often forget that every dollar the government spends is a dollar that the government took. Money spent by the Tennessee Valley Authority on new dams was money that couldn’t be spent by American businesses on other, actually profitable, projects. Other New Deal programs, such as the minimum wage, Social Security, the strengthening of labor unions, increased taxes on the rich, nationalized industries, etc., contributed to an atmosphere of what Dr. Robert Higgs has termed “regime uncertainty,” which makes businesses less likely to invest in new projects because they’re worried about what the future holds for the security of their property rights. Indeed, private investment remained exceptionally low throughout the Great Depression. It wasn’t until WWII created reliable income streams for certain industries that investment increased.

Something that all the students probably heard about in school but did not mention during the exercise (that I heard) was Roosevelt’s court-packing plan. During the first years of his presidency, the Supreme Court again and again struck down unconstitutional measures that were passed as part of the New Deal. In response, Roosevelt threatened to add a number of justices to the court, so that the majority would be in his favor. The plan was never enacted, but several justices left the court over the next few years, and so Roosevelt got his majority anyway. This is the same majority that gave us such horrible cases as Wickard v. Filburn, which held that growing wheat on your own land to feed your own animals affected interstate commerce, and was therefore subject to federal regulation. The decisions of the Roosevelt court haunt us to this day.

A crime that most students probably haven’t heard of is perhaps the most important. In April of 1933, Roosevelt ordered Americans to surrender virtually all of the monetary gold held privately in the U.S. He also embraced the Thomas Amendment, which gave him the power to change the gold content of the dollar. The dollar had been set equal in value to about 1/20th of an ounce of gold in 1834, and for the better part of a century afterward there was roughly zero net inflation in the United States. After FDR’s actions severed the relationship between the dollar and gold, control over money passed to the federal government, which has managed it in such a way that the dollar has since lost roughly 95% of its purchasing power. Inflation hurts many Americans; its prominence in the American economy began with FDR.
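For readers who wonder where a figure like that comes from, here is a minimal sketch of the calculation in Python, using round consumer-price-index values that approximate the official BLS series (the exact percentage depends on which index and base year you choose):

```python
# Rough purchasing-power calculation from a price index.
# These CPI values are round approximations (CPI-U, 1982-84 = 100),
# chosen for illustration; substitute actual BLS data for a precise figure.
cpi_1933 = 13.0    # approximate price level when gold was confiscated
cpi_2017 = 245.0   # approximate price level today

price_ratio = cpi_2017 / cpi_1933            # how many times prices have risen (~19x)
purchasing_power_loss = 1 - 1 / price_ratio  # fraction of its value the dollar lost

print(f"Prices rose about {price_ratio:.0f}x; the dollar lost about "
      f"{purchasing_power_loss:.0%} of its purchasing power")
```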

Finally, Roosevelt’s role in WWII was not what it appeared to be. The attack on the Greer, for instance, was provoked; the Greer had been tracking the German submarine for hours, guiding a British aircraft that depth-charged it, before the submarine fired. Pearl Harbor was partially instigated by the economic warfare that FDR had been waging against the Japanese through embargoes and the freezing of Japanese assets. In fact, there is evidence that FDR was trying to provoke the Japanese to attack, and knew about the impending bombing of Pearl Harbor ten days before it occurred. He also set the stage for the first use of atomic weapons in history. The classic justification for the use of the atomic bomb is that it saved more lives, both Japanese civilians and American soldiers, than it took. But this is only because the U.S., following the policy Roosevelt set, was demanding unconditional surrender. If that demand had been relaxed, allowing Japan to keep its Emperor, there would have been no need for the destruction of Hiroshima and Nagasaki.

I will give credit where credit is due: Roosevelt did end Prohibition. Now people can legally drown the sorrows caused by the other results of his reign. “Hurrah!”

Saturday, July 1, 2017

The Credential

*Adapted from a speech given in May, 2016.*

During the past two weeks I’ve had four conversations with six different individuals about the disappointing quality of higher education. It’s boring, it’s too easy, the professors are bad, it’s obscenely expensive, no one is taking the classes seriously, it’s not what students were expecting. Compounding this frustration, of course, is the fact that the college degree is not as valuable in the job market today as it was in days past. But college wasn’t always like this (after all, where do our misconceptions come from?). In the past, colleges really were institutions of higher learning, effectively educating students in a stimulating environment, and their graduates were basically guaranteed a good job after graduation. What happened? Why the drastic decline in quality? 

What’s changed is the significance of a college degree. Way back when, only the academically-inclined students attended college, because those were the students who wanted to learn more and who would end up applying the advanced knowledge they acquired. A bachelor’s degree was a certificate; it certified that the student had advanced knowledge in a specific area. However, the degree also had another function. It was a credential; it signaled that the student who possessed it had the potential to create more value than a student without it. Because of their greater aptitude and four years of rigorous work in college, graduates were identified by employers as desirable employees, capable of doing more than the high school graduate. As a result, college graduates were courted and paid higher salaries than non-graduates.

Of course, everyone else saw what was going on, and more and more students pushed themselves to get into and through college and earn that bachelor’s degree. Attendance increased (read: demand for college increased) and, as a result, the price of college increased. Still attendance rose, and tuition continued to rise with it. Eventually, the government stepped in and started pouring money into programs that would fund students’ educations, artificially lowering the price paid by students and causing attendance to soar, with tuition following. As tuition climbed, it became harder and harder to justify going to college for the learning alone; attendance could only be justified as an investment in the future. The certificate function of a college degree, the learning component, was pushed aside, forgotten in all but rhetoric, to make room for the credential aspect of the degree and its ability to get the student a good job.
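To see why a subsidy raises both attendance and sticker-price tuition at the same time, here is a toy linear supply-and-demand model in Python. All of the numbers are illustrative assumptions, not data; the point is only the direction of the effect:

```python
# Toy model: students demand seats based on the price THEY pay;
# colleges supply seats based on the tuition THEY receive.
# A subsidy drives a wedge between those two prices.
def equilibrium(subsidy):
    # Demand: seats = 1000 - 20 * price_paid_by_student
    # Supply: seats = 100 + 10 * tuition_received_by_college
    # Wedge:  price_paid_by_student = tuition - subsidy
    # Setting demand equal to supply and solving for tuition:
    tuition = (900 + 20 * subsidy) / 30
    seats = 100 + 10 * tuition
    return tuition, seats

for subsidy in (0, 10, 20):
    tuition, seats = equilibrium(subsidy)
    print(f"subsidy={subsidy:>2}: sticker tuition={tuition:5.1f}, enrollment={seats:5.0f}")
```

Each increase in the subsidy raises both the equilibrium tuition and the number of seats filled, which is exactly the pattern described above: attendance soars, with tuition following.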

Now we can see why professors and students often don’t click. Ninety percent of the students in a professor’s class don’t actually care about learning. They’re in class merely because they’re going through the motions, doing what they need to do to graduate and receive that credential. The professors, by earning their Ph.D.s, have demonstrated that they care about exactly the learning that is supposed to happen in college. And yet, when they receive their Ph.D. and get their teaching gig, they arrive to find an overcrowded lecture hall filled with students who are utterly uninterested in what they have to say. How long do you think it takes before the professor stops caring about the quality of teaching? (The proof of this can be found by visiting a professor during office hours. If you stop in with a substantive question, the professor is liable to leap from his chair and shout in excitement before giving you a very detailed answer and following up with practice questions. Professors are passionate about what they teach, but a classroom full of uncaring students is not conducive to eliciting that passion.)

Let’s analyze, for a moment, how a bachelor’s degree acts as a credential. Employers don’t actually care about what you learned in college. For most jobs, the major is irrelevant. Indeed, most job postings list a bachelor’s degree among the requirements, but hardly any specify what kind of bachelor’s degree, because the kind doesn’t matter. The company is prepared to train you in how to perform the job; your years studying biology may or may not help you do that job, but you’ll get the same training nonetheless.

Ultimately, the company cares only about your ability to do the job, your ability to create value for the company. Here’s the thing: it can’t know that you’ll create value, and it certainly can’t know that you’ll create more value than the other applicant. It doesn’t know you or your abilities. That’s essentially what the entire application process is: you trying to prove to the company you’re applying to that you can create value. You send in your resume, a list of your accomplishments, a list of things you’ve done. You provide references, people you’ve previously worked for who can vouch for your ability to create value. And this is where the college degree comes in. The company doesn’t know if it can trust you when you say that you can create value, and it doesn’t know if it can trust your references. But the college degree is really your college vouching for you. The employer may not have heard of you, but they’ve probably heard of the State University of New York, and your degree is SUNY telling the potential employer: “You don’t have to trust the applicant, but trust us. We spent four years verifying this student’s ability to create value.” In this day and age, that is all a college degree is: a credential of your ability to create value.

However, as more and more people recognize this function of the college degree, more and more people are acquiring degrees. These added numbers come from a previously submarginal group of students, students who wouldn’t have pursued the degree if it weren’t a credential. The government’s assistance amplifies this trend, drawing deeper and deeper from the submarginal group. The result is a massive number of students who don’t care about learning and who aren’t capable of as much as the students who should be in college. Like I said above, this affects the professors, and the quality of college, and eventually, after a few cycles, college isn’t what it used to be.

This actually weakens the power of a college degree. First, as the market is flooded with bachelor’s degrees (read: as the supply of bachelor’s degrees increases), their purchasing power decreases. Second, the quality of the average student who possesses a bachelor’s degree declines, which means his ability to create value is not as great as might be assumed at first. (Think about it: everyone in your class will graduate with the exact same degree as you. How many of your classmates are on Facebook, on their phones, asleep? How many showed up today?) This, of course, creates a dire problem for those of us who actually are capable of creating real value. A weakened credential hurts us as well, even if we weren’t the ones who weakened it. Again, the employer doesn’t know us. If you want to distinguish yourself as more capable of value-creation than the other guy (who now has the same degree as you), you’ll need something more than the bachelor’s degree.

It has now become necessary for you to create your own credential. 

Sure, there are other ways of differentiating yourself from others. Many students these days go on to get graduate degrees, hoping that this extra credential will be enough to get them the job they want. But I would venture to say that there is a profoundly better way to demonstrate your ability to create value: create value. Build something! Start a blog and write regularly, thereby showcasing your knowledge about a topic and your communication abilities. Take a trip to the hardware store and start building a wind turbine so that you can test various energy theories, and upload your experiments on YouTube. Write a story and publish a book through Amazon. Learn Java and create your own app for the market. Learn about overpronation and prepare a pamphlet for distribution in all NYS School Athletic Facilities. Use LinkedIn to record your achievements at work, such as increasing revenue by 33%. It doesn’t matter what you do, what you build; all that matters is that you do, build, something. If you want a good job, your employer needs to know that you can create value. Millions and millions of students are rushing to purchase a credential from a college that they can use to prove that they can create value. Imagine how powerful an impact your application will have if your credential is not a credential, but an actual demonstration of value-creation. 

I’m not saying that you shouldn’t go to college, or that you shouldn’t care about college. But I am saying that the college degree is no longer enough to secure your dream job, and it’s no longer a certificate of intellectual prowess. It is a shadow of what it once was, and the trend will only continue. If you want an edge in the job market and in life, you’ll need more than the bachelor’s degree can give you. You’ll need another credential. Your own.