Wednesday, June 30, 2010

Student Evaluations

Just this morning, while clearing off my desk, I leafed through my teaching evaluations before filing them away. As I do with declined proposals, I usually let some time pass after the end of the semester before reading them.

Our department very reasonably allows each instructor to use the format that he or she feels is most appropriate. My large introductory class evaluations are designed like a product survey: I ask the students to rate various aspects of my teaching, followed by an area where they are free to make comments. I take the numerical rankings with a grain of salt, but read the comments carefully. They can be inane, but sometimes I get useful feedback. For example, one student pointed out that color-blind individuals had trouble seeing the red markers that I sometimes used. I wish that someone had pointed this out earlier.

In my most advanced classes, I solicit an essay rather than have graduate students check boxes. Even so, some of the most sophisticated students are so conditioned to filling out surveys that they feel compelled to rate me numerically.

Our university is moving in the direction of a standardized form, and some colleges here have already adopted one. When I complain that it makes no sense to ask freshmen taking a general education requirement and graduate students taking advanced quantum mechanics to answer the same survey questions, I am condescendingly reminded that I can add additional questions or leave room for an essay.

Contrary to firm denials by administrators, teaching quality is often quantified by one number. I was once a member of a tenure and promotion committee in a college that uses standardized forms. Faculty X, who had an average evaluation of 3.2/4, was deemed a better instructor than Faculty Y, who had an average score of 3.1/4. The fact that courses taught by these two faculty members were at different levels and had students of differing demographics was not taken into account. It is not unexpected that a major taking an advanced course will rate an instructor differently than an introductory student who is in a crowded lecture hall with hundreds of other students.
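
To see how misleading this can be, make up some illustrative numbers (they are invented, not taken from any real case): suppose Faculty X teaches only a 20-student senior seminar in which the average rating is 3.5, while Faculty Y teaches an identical seminar (also rated 3.5) plus a 300-student introductory lecture that averages 3.0. Then Y's overall mean is (20 x 3.5 + 300 x 3.0)/320 = 3.03, and X appears to be the better teacher, even though the two are rated identically whenever they teach comparable courses.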

Hiding behind numbers is a convenient way to excuse oneself from the responsibility of making a judgment. When I did annual reviews for our faculty, I read every comment - even in classes with 200 students. One particular faculty member had a low numerical score for his teaching. A small majority of the comments said that the course was too hard. Another set of responses stated that the course was challenging but very interesting, and that this particular professor took extra time in class to explain difficult concepts. Also, he was available in his office whenever the students wanted to talk. The comments as a whole painted a much more positive picture than his scores alone. In contrast, a high-scoring faculty member had comments to the effect that "this class was fun." Few comments mentioned anything about learning or being challenged. I fear that focusing on a single number generated by students on a standardized form will encourage superficial teaching. We are already sending the message that if the material is hard, there is something wrong with the teacher.

My teaching style has changed over the years, especially in the advanced graduate classes. I used to prepare my notes very carefully and tried to be very precise in my delivery. Given the complexity of the subject matter, I was effectively copying from my notes to the blackboard to avoid errors: I was not thinking, but rather reciting. Though I had very good teaching evaluations, I felt that the students were simply creating a transcript of my lecture. They were not thinking in real time.

As I got older and felt less obligated to please the students, I changed my style. Rather than read from my notes, I prepare very carefully for each lecture, making sure that I understand the ideas that I am trying to convey as well as the critical parts of the derivations. Since the mathematics is quite complex, it is often difficult to remember the logic while on my feet. When that happens, I enlist the students' help.

Because I am thinking as I am teaching, the pace is just right for the students to think along with me. When I get stuck or make a mistake, the students are always more than happy to point out my error or to help me out. This approach is hard on me, because when I screw up, it sometimes takes a while for the students or me to notice. The longer it takes to find the error, the more painful the process of getting back on course.

I believe that the students are learning far more than in the days when I meticulously followed my notes. However, I invariably get a sprinkling of comments that I tended to make mistakes or that I was sloppy (or that I was unprepared!). In the past, I would have cringed at this criticism, but I am now confident enough in my methods to discount them.

Recently, a colleague passed along an interesting essay about class evaluations: a New York Times opinion piece titled Deep in the Heart of Texas, by Stanley Fish. The author writes:

"But sometimes (although not always) effective teaching involves the deliberate inducing of confusion, the withholding of clarity, the refusal to provide answers; sometimes a class or an entire semester is spent being taken down various garden paths leading to dead ends that require inquiry to begin all over again, with the same discombobulating result..."

I was intrigued by this analysis, not only because I was coincidentally musing about the same topic, but also because it got me thinking more about the corporatization of academia. How does one balance the need to assess teaching with the reality that the seeds of our labor do not take root until many years after graduation?

In a perfect world, the academic would be an expert teacher who understands how teaching techniques today lead to the desired outcomes years or decades later. Since the benefits of a good education appreciate over time, no single teacher can live to fully observe the fruits of their labor. Such information is necessarily accumulated over the generations.

Societies with highly-educated individuals tend to have the highest quality of life characterized by low crime rates, material comforts, good health, vigorous economies and functional institutions. As such, how does one ensure great teaching and efficient learning when it is so difficult to judge a teacher based on direct observation of the results?

Students who enjoy learning will find ways to succeed even when the mode of instruction is sub-optimal. If prosperity is to flourish in the general population, we must think of ways to educate the large number of students who see education solely as a route to increasing their earning potential. In my perhaps biased sample, this large cohort approaches education as a patient would an unpleasant medical procedure, thinking, "I want to get through this process with the minimum pain."

Such a student views the diploma as a membership card entitling the holder to a well-paying job. From the perspective of the employer, the diploma is evidence of an education, whose quality is commensurate with the reputation of the granting institution. As educators, it is our responsibility to stand behind our product.

To understand good teaching, it is important to view it in light of the mission of the institution. The goal of a university is to produce individuals who have demonstrated the ability to think when confronted with novelty, to analyze and assess complex issues, and to devise thoughtful long-term solutions without rigidly adhering to preconceived notions. These traits are necessarily learned within some context, which is provided by the student's major program of study.

I would thus associate good teaching with an atmosphere of continual challenge. Learning should be uncomfortable. Challenging problems do not have cook-book solutions. One must try new approaches that often lead to dead ends. Students often complain when they are not given recipes for getting from point A to point B. Some yearn for facts and feel that any amount of confusion, however small, is an indicator of bad teaching.

I will not attempt to answer the difficult question of how to ensure good teaching. It is far easier to identify a perverse reward system, such as using the mean numerical score of student evaluations as the basis of faculty compensation. If we want our students to think outside of the box, we are sending the wrong message by taking seriously the results of simple-minded surveys whose design is the antithesis of the skills that we endeavor to teach. It is reasonable for Starbucks to take seriously my displeasure with getting a cold cup of coffee; but if ensuring the long-term efficacy of our educational system is the goal, longer-term metrics based on thoughtful study need to be developed and implemented.

Tuesday, June 29, 2010

Rejection Still Hurts

Twenty years of experience with rejection makes it no easier. Funding brings immediate optimism in anticipation of the exciting work to come. On the flip side is the dreaded email that coldly states, "Panel Recommendation: The proposal was placed in the Do Not Recommend [DNR] category by the panel." I avoid the gloom of rejection by filing away the reviews for several months, allowing enough time to pass for me to give them an objective read. Now that the sting of rejection has subsided, I am ready to share the reviewers' comments on a proposal that was rejected last year.

Two years ago, I felt glib satisfaction when NSF began soliciting "highly novel" proposals that showed potential for truly transformative breakthroughs. I was filled with certainty that the reviewers would see the brilliance of my ideas. The basis of my proposal was simple. In analogy to a transistor, which controls the flow of electrical current, the materials that we are studying can control the flow of light. They also act as sensors and can change shape in response to light. But the coup de grâce is that these materials can be made into devices that can be integrated into big morphing blobs with incredible intelligence (click here for a tutorial on photomechanical effects and smart materials).

We had built an optical logic circuit - equivalent to several transistors' worth of computing power, but with all the extra functionality, and all in a single device. We argued that, in analogy to electronics, if we could demonstrate the equivalent of an integrated circuit (we started small, proposing to connect two such devices together), then the potential technological impacts would be staggering. Below are the summaries from the three reviewers:

Summary Statement of Reviewer 1

The objective is to make a novel new material that has the ability to morph in response to stress or light. The PI aims to use this bifurcation component for mimicking a neural network. Although there are some concerns including size, scalability, speed and power efficiency, this might be a good test bed for studying unit components for a neural network.

Summary Statement of Reviewer 2

This is an excellent proposal with an interesting novel idea that can lead to a significant impact in a wide range of fields/applications. Also, the theoretical and experimental studies around this subject are broad enough to constitute a new field. I consider this proposal a highly transformative work.

Summary Statement of Reviewer 3

This is a visionary proposal with concrete short term goals. It could lay the foundation of a transformational shift in thinking about optical "materials." It is a refreshingly novel topic coupled with strong collaborations and interesting educational and outreach efforts. However, a more succinct background section combined with more extensive description of the experimental details would have made the proposal stronger.

Just based on the summaries, I would have thought that my proposal would be funded. There were no errors in my way of thinking, the work seemed promising, and it might even open up a new field.

To summarize the reviewers' comments (based on the full reviews and as alluded to in the summaries), the proposal was very good. The two criticisms were that (1) I did not provide extensive details of the experiments and (2) there were some concerns about how good this technology would be down the road. In essence, it would be akin to telling the inventors of the transistor that they had to anticipate all of the potential problems in building integrated circuits, and that they did not give enough details about how they would design technologies that were decades away.

The National Science Foundation has an interest in supporting science to nurture new discoveries that are intrinsically interesting or that lead to new technologies - two criteria that my proposal met. Even my educational plan was considered innovative. However, to be fair, I understand that there are many more proposals submitted than can be funded, and not all good work can be supported. This is a fact of life that all scientists accept.

To end this post on a happier note, I quote the panel summary of a proposal of mine that was funded on the topic of theoretical studies of fundamental issues of light-matter interactions:

"This proposal addresses important and fundamental issue of optimizing nonlinear optical response of optical materials to achieve the highest figures of merit by performing modeling of 2nd, 3d and higher order non-linear susceptibilities. If successful, the theory/modeling will guide the materials synthesis in developing new optical materials with large nonlinearities increased by a factor of thirty, thus opening up interesting and important applications, e.g. in the area of cancer diagnostics and treatment. The panel members unanimously expressed support of this proposal as high payoff transformative direction of research."

Now this is a review that I don't mind reading repeatedly! But instead, my focus is on the steady progress that we have made in the two years since the project was funded. In the end, a few declined proposals don't diminish the great satisfaction of doing interesting research.

Sunday, June 27, 2010

There is No Shortage of Scientists

In her article "The Real Science Gap," writer Beryl Lieff Benderly points out that there is no shortage of scientists. Rather, there is a shortage of high-quality jobs:

It’s not insufficient schooling or a shortage of scientists. It’s a lack of job opportunities. Americans need the reasonable hope that spending their youth preparing to do science will provide a satisfactory career.

She goes on to describe what every Ph.D. student learns firsthand upon earning his or her degree. Full-time job openings in physics and related areas are not the norm. According to a recent report by the American Institute of Physics, "About 60% of the new PhD's in the classes of 2005 and 2006 accepted postdocs after receiving their degree..."

"...An academic position at a college or university remains the prevailing long-term employment outcome to which most new physics PhDs aspire. The majority (76%) of individuals who had a long-term goal to work in a college or university position accepted a postdoctoral appointment after receiving their degree. A postdoc is typically expected as a necessary step to obtain such a position. New PhDs who aspired to a career within the civilian government or at national labs also had a high proportion (77%) accepting postdocs upon completing their degree"

Having experience in both academics and industry, I have found that students who aspire to work in academics or the national labs are lured by intellectual stimulation. This is not to say that industrial physicists are dolts. In my 5-year tenure at Bell Labs as a member of technical staff (1985-1990), I interacted with many very bright and dynamic individuals.

Back then, Bell Labs was a huge operation distributed over 22 geographic locations, employing about 20,000 individuals - a minority of whom were researchers (many of the rest were office staff, accountants, etc.). That still left a large number of researchers doing varied jobs. Some labs were chartered to develop a particular product, and thus focused on a scientifically narrow goal. Other labs, like the one in which I worked, gave their scientists the best facilities and the freedom to pursue science. Some of the science was focused on areas that could potentially make a profit (my work), whereas other labs had the luxury of working on esoteric topics. Bell Labs used sexy research to enhance its public image, and to attract the best and brightest.

Ninety percent of the researchers that I considered top-notch scientists ended up in academics. The others, who had an interest in developing products and serving humanity through technological advances, remained in industry; some of them are now high-level managers at large high-tech companies whose products we all know and use.

People who remain on the academic track are willing to spend six to eight years getting a Ph.D. degree, followed by about five years of relatively low-paying postdoc positions, in hope of getting a rare assistant professor position, after which it takes an additional six years to maybe get tenure and lifelong job security.

I was willing to jump through the hoops, not because I wanted a guaranteed income for the rest of my life, but because I wanted to have the guarantee of perpetual intellectual stimulation and the resources for making new discoveries. Teaching and research provide the kind of high that makes the whole crazy process worth it. Those who fail may disagree.

This brings up the winner-takes-all system of incentives that seems to be more common in the U.S. than elsewhere. In many European countries, the typical high school student is already pigeon-holed into a career path. In contrast, the U.S. system is a war of attrition. The best undergraduates get into grad school. The best Ph.D. students get the top-notch post doc positions, the best of whom get faculty positions.

The academic path is similar to the entrepreneurial path. Many individuals work for years, some for decades, motivated by the small chance of making millions of dollars. And money is not always the motivation. Many entrepreneurs delight in the idea that their product might save lives or make life more meaningful.

Independent of the motivation or the activity, the common thread is that many people are working very hard over long periods of time with little compensation in hope of reaping the benefits of their labor. The question I find most interesting: is this system the most efficient in providing society with the most benefit?

It is difficult to answer this question, but let's consider the narrower topic of the merits of rewarding a small cadre of winners. Conservatives would say that entrepreneurs deserve the money they make because they earned it. Liberals might counter that no-one deserves such a high monetary payout.

These polar-opposite responses miss what I believe to be the important point. Innovation in science, commerce, the arts, or any other worthwhile activity requires not only lots of hard work, but also that all sorts of crazy ideas be tried and culled in a process similar to Darwinian evolution.

Recall that evolution requires reproduction, mutations/variations, and selection. In a free market, good ideas or products propagate between human hosts, wacky ideas or products are the mutations, and the selection process resides in the willingness of the consumers to shell out valuable resources in exchange for the new product. The free market system encourages "mutations" by promising huge rewards to the winners. It is this promise that fuels innovation. The winners do not deserve their rewards based on the intrinsic value of their contribution (whatever that means), their hard work, or their intelligence. Rather, the promise of rewards drives the players to participate. Limiting the steep reward structure would limit the needed fluctuations that send tendrils of innovation into uncharted territory.

The same may be true in science. If resources are not sufficient to grant tenure to all the brightest people with Ph.D.s, perhaps the dangling of the coveted tenured position as a reward motivates scientists to differentiate themselves. Consider Alan Guth's inflationary model of the universe, which he reported at the end of a decade-long string of temporary post doc positions. In addition to getting him a faculty position at MIT and several prestigious awards, his work has been extremely influential in cosmology and forms the foundations of a theory of the early universe. There are undoubtedly tens of physicists who fail for every Alan Guth who succeeds.

It may appear heartless to condemn bright individuals with advanced degrees to years of temporary positions and unemployment. However, such individuals have the skills to seek alternative employment in engineering, finance, industrial labs, and start-up companies that offer potentially lucrative careers. It's a choice left to the individual.

So, while it may not be possible to prove this hypothesis, I posit that perhaps our form of capitalism attracts risk takers that thrive in a winner-takes-all competition, leading to the fluctuations that are required for innovation.

The mantra that we need to "train more scientists for the future" has been around at least since I was an undergraduate in the 1970s. I have yet to see huge shortages of talent. When domestic talent dries up, the foreign pipeline gushes open. On the other side of the issue, Benderly bemoans the lack of job opportunities and the need to provide more meaningful employment to attract the brightest students. Both extreme positions point to the same conclusion. The supply of PhD scientists seeking academic positions exceeds demand.

My essay may appear to be the self-serving pontifications of a tenured academic who does not care about all the poor souls who have lost the academic marathon. On the contrary, I write this as would a father informing his son about the realities of life. I would counsel my son to follow his heart based on full knowledge of the facts, and to work with full intensity knowing the odds. In fact, I have a son who is a Ph.D. student in Physics, and this is the advice I continue to give to him.

Thursday, June 24, 2010

Technology Can be a Pain

While my research is often funded because of its potential for high-tech applications, I sometimes find technology to be a pain in the neck. I pre-ordered an iPhone 4, which arrived yesterday. I immediately drove to my local AT&T Store to get my contact list transferred to my new iPhone; but alas, because the official release date was not until today, they did not have the right equipment to make the transfer. The AT&T Store also did not have any iPhone 4 accessories.

New technology is such a time sink. I spent hours loading apps and getting to know my new time-saving device. It is certainly a wonderful piece of technology, and trying out all the new features and apps is very addictive. But, at the end of the day, I felt that I had wasted a large chunk of time that could have been put to more productive uses. Hopefully, this phone will be a time saver in the future.

I was also amongst the first group of Geeks to get the Nook in December 2009. Being a new product, I knew it would have all sorts of kinks, but I liked the fact that it used Google's Android operating system, that the files were not stored in a proprietary format (as is the case with the Kindle), and that one could lend a book. There certainly were lots of kinks, but after a couple of software updates, the Nook became a wonderful eReader. Until yesterday...

A couple of days ago, I got an email that announced release 1.4. So, I turned on the WiFi on my Nook, and started the update process, which turned out to be an endless loop of downloading the update, followed by rebooting. Update 1.4 never successfully loaded. After several hours of effort, some very helpful people at my local Barnes and Noble Store identified the problem, and graciously replaced my Nook with a new one. At least my wife and I got to have coffee and browse through lots of books. However, B&N made some money on the deal because we couldn't resist purchasing a book.

To my great irritation, my new Nook would not charge, and it would not connect to my secure WiFi at home. I took the Nook back to the store and dropped it off so that they could troubleshoot the problem. They were able to get it to charge by doing a hard reboot. Of course, I should have done that myself, but I was sick and tired of dealing with technology. After returning my Nook to me, they told me that I would need to call customer support if I could not connect to my secure network. Now I am looking forward to someone reading from a script, telling me that my WiFi router is not compatible with the Nook, which I know to be false because my previous Nook connected to the same router just fine. Perhaps I am getting too cynical.

This afternoon, I am meeting with my students to talk about research. Our work may eventually lead to new technologies that will frustrate the next generation of Geeks. At least I can look forward to thinking about physics and enjoying the intellectual stimulation before being sidetracked by the next time sink.

Wednesday, June 23, 2010

A Typical Summer Day

As I wrote in my previous entry, I was spending a large amount of time writing a proposal, updating a progress report to NSF, and to top it off, I was notified that an NSF proposal was declined. The NSF news was particularly annoying because my proposal was a resubmission of a previously-declined proposal that I had revised according to the panel's recommendations. It seems that they always find new reasons for rejection - often contradicting the previous panel's suggestions.

But making my rounds in the lab yesterday afternoon with my students reminded me of the great pleasures of research. From the time I arrived, my labs were buzzing with excitement (at least for me!). One student was designing an oven chamber that will allow his experiments to shed more light on the role of a polymer in reversing the arrow of time. A new student had just finished analyzing an experiment in which we can literally watch the process of decay and self-healing. If my proposal gets funded, we will be able to extend these imaging studies to get a full spectrum at each pixel. While the results of these experiments are already hinting at some new and wonderful physics, future experiments will give us the data we need to build a deeper understanding of the process.

Another student was busily fixing a laser, which we will use to test our theories of the fundamental limits of light-matter interactions, while another pair of students worked on new theories that may lead to the manipulation of the quantum properties of materials by controlling the underlying nano-structures.

To me, teaching and research are strongly intertwined. Students in my lab are learning to do research under my supervision, I learn new things in the process of doing research, and my colleagues and I learn from each other. This past semester, two projects I assigned in class will lead to publications. So in addition to learning about new topics, the students ended up producing new knowledge and an extra line on their resumes. Research is just a more advanced method of learning in the process of making new discoveries.

When I got home, I reluctantly returned to my computer to make the finishing touches on all the boring parts of my proposal: the budget, price quotes, producing justification for not charging overhead on items purchased to build new equipment, etc. I finished these tasks by about 10:20pm - just in time to do some light reading before going to sleep.

I then noticed the statistical mechanics book on my desk, and could not resist the temptation to browse a bit. I got sucked into the section on non-interacting particles, which led to discussions of entropy, gamma functions, and their use in determining the volume of an N-dimensional sphere. I never cease to be amazed at the ingenuity of the human mind, and how very general results can be squeezed out of some simple ideas, a shake of logic, and a smidgen of mathematics. My 10-minute perusal turned into an hour-and-a-half journey. When I finally made it to bed, I was so worked up that a half hour of light reading was not enough to unwind. I tossed and turned for a while before finally falling asleep.
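
For readers curious about what kept me up (stated from memory, so consult a textbook for the derivation): the volume of a ball of radius R in N dimensions is

V_N(R) = π^(N/2) R^N / Γ(N/2 + 1),

where Γ is the gamma function. The formula earns its keep in statistical mechanics because the momenta of N free particles of mass m with total energy no greater than E fill a 3N-dimensional sphere of radius √(2mE), so counting the states of an ideal gas reduces to computing just such a volume, and the entropy follows from its logarithm.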

This morning, I woke up a bit tired, but with a general optimism that comes from my forays into physics. I just finalized the proposal, and submitted it, taking a 20 minute break to write this post. Now, back to the rat race...

Monday, June 21, 2010

The excitement of teaching a new course

I spent this past weekend playing ice hockey in a tournament in Wenatchee, Washington. Most people derive pleasure from activities in which they excel, but I love playing ice hockey even though I am not very good. In fact, I was the worst player on my team; but, I got to participate because I organized the team.

There is nothing more exhilarating than the feeling of the cool breeze rushing past my body as I am skating with full force to the puck. To the casual observer, I am sluggish and clumsy, but from my perspective, I glide gracefully through the air. My favorite place is in the eye of the storm, right in front of the opponents' net, with bodies flying and disembodied voices yelling in warning. The melee around me is hypnotic, making me feel serene and at peace. Some people meditate or do yoga to relieve stress, but for me, hockey is the best therapy. After a weekend of ice hockey, I am ready for anything, even travel - which I despise.

The ultimate compliment to a scientist is to be invited to give a talk at another institution or a conference. Nothing beats talking about one's work, especially in front of an audience of eager listeners. But more importantly, meeting with other scientists is the ideal forum for exchanging ideas and regenerating the creative spirit. On the flip side, getting to a meeting requires travel, which I find extremely distressing. Weeks before an overseas trip, my early morning hours are fraught with intense anxiety that prevents sleep, followed by less anxious daytime hours spent under a cloud of fatigue that keeps me from working efficiently. This snowballing cycle of anxiety and exhaustion takes its physical toll.

My travels last year were particularly harrowing, but meeting with old colleagues and friends in beautiful settings to talk about life and science yielded great satisfaction. I spent a large chunk of the summer of 2009 in Belgium, collaborating with colleagues and giving talks. I started the summer by giving an invited talk in Brussels on "A Hierarchical Approach to Making a New Class of Ultra-Smart Morphing Materials,” at an international conference, New Molecular Materials for Advanced Optical Applications in a Changing World, which was sponsored by the Belgian Academy of Sciences.

Then I made my way to the 9th Mediterranean Topical Meeting on Novel Optical Materials and Applications in Cetraro, Italy, where I presented an invited talk on “Using liquid crystal elastomers to transmit and receive a force on a beam of light.” Returning to Belgium, I gave a seminar at the University of Leuven, in a special series of talks called the INPAC Lectures on Trends in Nanosciences. I finished my tour in Europe by giving the opening keynote address on the topic “A birds-eye view of nonlinear-optical processes: unification through scale invariance,” at the International Symposium on Materials and Devices for Nonlinear Optics, ISOPL’5, on the scenic island of Ile de Porquerolles, France.

I planned to attend the whole meeting in France, arriving a couple of hours before the start of the meeting and departing on the last day. A few days before the meeting (after I had booked my reservations), I learned that my talk would be the opening keynote address. The keynote address is typically much longer than an invited talk, so I expended some effort to fill the additional time with what I considered to be interesting topics. I also tried to change my flight to arrive earlier in the day, but all the flights were full. So, I spent a couple of stressful hours making contingency plans in the event that the flight was late or traffic prevented us from getting to the docks in time to catch the hourly boat to the island.

As is typical with travel, not all the pieces fell into place. Our plane arrived at the Toulon Airport less than two hours before my talk. After waiting over half an hour for our luggage, we took a 20-minute cab ride to the docks, only to find that we had missed our boat. The next one was not due to leave for another half hour, so we were forced to rent a water taxi that seats about 30. The captain, the first mate, and her dog got my wife and me to the island in no time. More importantly, the James-Bond-like high-speed ride and the beautiful scenery had a wonderfully calming effect. From the docks, it was about a 1 km walk to the conference site (no cars are permitted on the island). I made it with 10 minutes to spare!

My summer ended with less excitement in San Diego, where I gave two invited talks at the SPIE meeting, chaired two sessions and was on two program committees. While in Pullman, I hosted three collaborators from Belgium: Prof. Koen Clays, Dr. Javier Perez-Moreno (a 2007 recipient of the joint Ph.D. degree between WSU and the University of Leuven, who is now on a postdoctoral fellowship in Leuven), and Inge Asselberghs. Several highly visible papers have resulted from this collaboration.

In September, I headed to Australia, China, and Chile to give invited talks on “Smart Morphing Materials,” “Using New Theories to Understand Light-Matter Interactions to Optimize Materials for Next Generation Technologies,” and “The Role of Polymers in Reversing the Arrow of Time.” My fall travel ended in October with a trip to Ohio, where I gave a condensed matter seminar at Case Western Reserve University in Cleveland and a colloquium at Kent State University.

This six-month period of travel, while professionally rewarding, was physically exhausting. Almost a year after my marathon travels, I have yet to regain my equilibrium. This past May, I gave an invited talk at a meeting in France that was dedicated to the very distinguished Professor Joseph Zyss on the occasion of his 60th birth year. This interdisciplinary meeting was first rate, the attendees were all top notch, and the social activities elegant in the extreme.

Recovering from my trip to France and perseverating over my pending trips to San Diego and Budapest in July/August, I found that the hockey tournament provided a wonderful break.

Ironically, even the tournament was a source of stress. Upon my return from France, I got a text message from a player asking me about the team roster. Only then did I recall that I had agreed to organize a team. I replied that I was working on a roster, but didn't mention that I had lined up only 3 players and a goalie; we needed at least 7 more players. Because it was a coed tournament, I needed a critical mass of women, so I sent out lots of emails. Unfortunately, one of the competing teams from our area had taken all the best players. Eventually, I formed a strong team, though one short on superstars.

To our great satisfaction, we won all three games in the first round, even beating our stacked sister team, only to lose in the finals, placing us 2nd out of the 6 participating teams. That same weekend was our wedding anniversary, so my wife and I enjoyed the dining opportunities in the area.

So what does this long-winded entry have to do with teaching? I'll answer that question shortly. The premise of this blog is that thinking about physics and learning about the universe brings greater satisfaction to life than fame and fortune.

At this moment, I am in the middle of writing a proposal - a task that I find quite unappetizing. My docket is filled with papers to edit, manuscripts to write, writing assignments to correct, and budgets to balance. Extreme drudgery. But, as I sit at my computer, I have noticed two fresh new books on my desk: one on Classical Mechanics and one on Statistical Mechanics. These are the required textbooks for two graduate courses that I am slated to teach for the first time this fall and next spring.

Preparing a new course is a herculean task. It takes me countless hours to solve all of the problems and to decide which ones are pedagogically suitable as assignments. At times, it may take hours or days for the concept on a single page to soak in. And even more work is required to make the material clear to the students. I find the process delightful.

As I thumb through the 500+ pages of each book, I thirst for the knowledge that they offer, and can't wait to immerse myself in the serene state of learning, and then to return to the real world to share my knowledge with others. The process of lecturing and interacting with students brings new insights, and often leads me into new research directions.

Teaching is the activity from which I derive the greatest satisfaction. The process of teaching more than compensates for all the day-to-day frustrations and drudgery. In what other venue can one ponder the deepest questions in a dialogue with others who share a passion for understanding? When I emerge from 50 minutes of lecturing, all of my worldly concerns seem petty and trivial.

Friday, June 18, 2010

Charges Rule!

While gravity is the universal glue that holds together the stars, keeps us from flying off the surface of the Earth, and is the very stuff of space-time itself, from the perspective of our everyday lives electromagnetism is the force that most profoundly affects us. At the heart of everything we are and do are the interactions of electromagnetic fields and charges. Indeed, matter is merely a bag of charges that are held together predominantly by electric fields.

Connoisseurs of the strong force may argue that theirs is the most important, since it binds together the neutrons and protons that form the tiny nuclei at the center of every atom; but, alas, the swarm of electrons that buzz around the nuclei shields them from view. We can be thankful that nuclei exist as anchors for the frenetic electrons, but physicists would agree that our personal interactions with the universe are governed by the effects of electromagnetic fields on electrons.

When our eyes see, it's because light - made of wiggling electric and magnetic fields - causes charges to slosh around in our optic nerves, eventually leading to an electrical signal that finds its way to our brains. Similarly, charges communicate with each other by shouting not words, but bursts of light. It's the coordinated symphony of these light bursts that choreographs the dancing collection of charges.

The cells that constitute a brain interact with each other through electrical signals. But the animal body is also the host to many chemical reactions and thermodynamic processes. One might therefore defend chemistry as the true seat of life, with charges as essential albeit second-class citizens. But what is chemistry? You've guessed it! Chemistry is no more than interactions between charges.

Quantum Mechanics, as quantified by the Schrodinger Equation, imposes the rule of law that must be obeyed by all particles such as electrons and atoms (as long as they are not moving at nearly the speed of light). Social gatherings of atoms form communities called molecules. While there is much shouting and motion within the community - all of which obeys the grammar of quantum mechanics - the molecule speaks with one voice. All of these complex interactions within the group lead to an entity that behaves in a well-defined way because of a very simple rule: like charges repel and opposite charges attract, and the strength of this force grows as the charges get closer together. This relationship is called Coulomb's Law. All of chemistry can be explained by applying the crank of the Schrodinger Equation to the tune of Coulomb's Law.
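
To make the rule concrete (written schematically, with the vector and operator details suppressed): Coulomb's Law says that two charges q1 and q2 separated by a distance r push or pull on each other with a force of magnitude

F = k q1 q2 / r^2,

repulsive for like charges and attractive for opposite ones, where k is a constant of nature. The crank itself is the Schrodinger Equation, H ψ = E ψ, in which the energy operator H contains nothing more than the kinetic energies of the electrons and nuclei plus a Coulomb term like the one above for every pair of charges. Turning that crank for a given collection of charges is, in principle, all of chemistry.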

By understanding the social rules of the electrons and nuclei, the complex behavior of the molecular community can be predicted. The rules are communicated in a mathematical language via the Schrodinger Equation. Being a differential equation, it can only be solved exactly for the simplest cases. One of the triumphs of early twentieth-century Physics, a time of explosive discovery the likes of which has yet to be repeated, was the precise prediction of the colors and intensities of light emitted by the scream of an electron in a hydrogen atom when it de-excites after being jarred by a pulse of light or an electric shock.
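
The numbers behind that triumph are compact enough to quote (from memory, so forgive the rounding): the electron in hydrogen may only take on the energies

E_n = -13.6 eV / n^2, for n = 1, 2, 3, ...

When the electron drops from a level n_i to a lower level n_f, the photon it emits carries away exactly the energy difference, which sets the color of the light. The jump from n = 3 to n = 2, for example, releases about 1.9 eV and produces the familiar red line of hydrogen near 656 nanometers; the whole pattern of lines predicted this way matches what is measured in the lab.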

To go beyond this simplest class of atoms, made of one electron and one nucleus, requires that the Schrodinger Equation be solved using approximation techniques or high-speed computers. Such approaches lead to accurate predictions for more complex systems and help us to understand the concept of chemical bonds, which connect atoms. Indeed, the order observed in the periodic table of elements, as first arranged by the Russian chemist Dmitry Ivanovich Mendeleyev in the mid to late 1800s, can be understood through the simple concept of interactions between the social charges.

As the complexity of a molecule increases, it becomes more difficult to predict its properties using Quantum Mechanics. It's not that Quantum Mechanics is in any way breaking down, but rather that computer memories are too small and microprocessors too slow to get an accurate answer in a reasonable amount of time.
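
A back-of-the-envelope way to see the problem (my numbers here are illustrative, not a description of any particular calculation): in the brute-force approach, the many-electron wave function is written as a combination of every way of distributing the electrons among a chosen set of orbitals. Distributing just 10 spin-up and 10 spin-down electrons among 20 orbitals already gives C(20,10) x C(20,10), or roughly 3 x 10^10, configurations to keep track of, and the count grows combinatorially as the molecule gets bigger. The equation stays simple; the ledger needed to solve it does not.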

We get around these issues by extrapolating what we understand for smaller systems. For example, chemists make stick figures of larger molecules by using qualitative rules for making bonds between atoms. This approach allows them to approximately predict the properties of more complicated materials. To put this in the language of Physics, chemistry is a compilation of the myriad ways that charges interact with each other. While such empirically-determined rules are not hard and fast (there are always exceptions), this approach has been extremely profitable in designing and synthesizing, through a process of trial and error, new molecules that are useful for a given application.

To illustrate how all this relates to everyday experience, consider physically touching someone. We might conclude that the force of contact between two objects is a new type of force, and we would be in good company. Even Isaac Newton considered the force of contact distinct from the others; but it is not. Newton could be excused for his ignorance of electricity and magnetism, which would not be fully understood until Maxwell came on the scene in the eighteen hundreds.

When the fingers are brought together, the negatively charged electrons repel, leaving the positively charged nuclei behind. As we push harder, the protons get closer together and the shouting match of light bursts swells, yielding greater repulsion. Even when pressing your fingers together with gut-wrenching vigor, a small gap remains. The conclusion is that we can never touch anything. The sensation of touch originates in the electric repulsion between charges.

On one occasion as a young adult, I was annoying my parents - as was my habit - by picking the M&Ms off the cookies. In response to my persistent disobedience, my mother yelled in anger, "Don't touch the cookies!" In turn, I calmly explained to her the Physics underlying her misconception about the physical reality of touch. My smug satisfaction was abruptly interrupted by a swat across the head, and my indignant protest was met with a devious smile: "I didn't touch you either."

So, when you kick a football, kiss your sweetie, hit a nail with a hammer, slide on a patch of ice, use your computer, or float in the heavens on a hang glider, you are relying on the interactions between charges. You would be hard-pressed to find an everyday experience that did not originate in the flicker of light between the charges that form matter.

We might be faulted for ignoring the humblest force of the four, the weak interaction, which is responsible for encounters between ghostly neutrinos and matter. Neutrinos are emitted when a neutron decays into a proton and an electron (more accurately, anti-neutrinos are emitted) and are produced in copious amounts in supernovas. A light year of lead is required to stop one of these tiny sprites. Interestingly, the universe is filled with neutrinos. If we could take a snapshot with a magic camera that stops neutrinos in their tracks, we would find that any volume around us the size of a sugar cube would contain about 1000 neutrinos. But, while they are pervasive and travel near the speed of light, they interact so weakly with matter that over a typical human lifetime, our body captures about one neutrino from this large surrounding sea. We can therefore ignore the effect of the weak interaction on our daily lives. That we are made from stuff that formed in supernova explosions is another matter. It happened so long ago!
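
For the record, the sugar-cube figure comes from the relic neutrinos left over from the Big Bang: standard estimates put their density at a few hundred per cubic centimeter when all species are counted, so a sugar cube of a few cubic centimeters holds on the order of a thousand of them at any instant. I quote these numbers from memory, so take them as rough.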

All processes in nature can be reduced to the four forces (gravitational, weak, strong, and electromagnetic). The idea that the complexity of the universe is governed by the action of four simple rules is both compact and elegant. While gravity may woo us with its beautiful geometric description of space and time, and the strong force may capture our imagination by explaining how the smallest particles (the quarks) are held together and where the sun gets its power, life and our existence today are for the most part dominated by electromagnetism.

As a graduate student, I was tempted by and flirted with all four of the forces of nature, but by a fluke of fate ended up concentrating on the force that makes us tick: the most practical of the fundamental forces, the one that governs interactions between charges and electromagnetic waves. Light is a specific case of an electromagnetic wave that occupies the visible part of the spectrum. And this is precisely what makes light so special: we can see it.

Everything that we can know about the interaction between light and matter originates in the interactions between photons, the tiny particles of light, and charges. The interaction of light with a molecule is quantified by a function called the polarizability and the interaction between a beam of light and a collection of molecules is called the susceptibility. As such, these special quantities we call the polarizability and the susceptibility provide us with the code to decipher all of the richness of electromagnetic phenomena in materials that we take for granted. We will need to consider these quantities carefully if we are to understand how we interact with the universe. More pragmatically, understanding what makes the polarizability tick empowers us to use these interactions for the benefit of humanity. That is why I study nonlinear optics.
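
For readers who want to see the objects behind the words (written schematically, with tensor indices and numerical convention factors suppressed), both quantities appear in a power series in the strength E of the light's electric field. A single molecule acquires a dipole moment

p = α E + β E^2 + γ E^3 + ...,

where α is the polarizability and β and γ are its higher-order cousins, while a collection of molecules responds with a polarization

P = ε_0 ( χ^(1) E + χ^(2) E^2 + χ^(3) E^3 + ... ),

where the χ's are the susceptibilities. The first terms describe ordinary refraction and absorption; the higher-order terms are the nonlinear optics that I study.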

Thursday, June 17, 2010

An introduction

My Motivation


Why do I love Physics? Because in no other intellectual pursuit is such a small investment rewarded on such a grand scale. The mind's interpretation of experiments here and now enables us to understand everything from the smallest thread of space to the vastness of the universe, in this moment and throughout eternity.


I am ambivalent about my late-night realization in 2005 that I will never be a famous Physicist. Actively pursuing answers to the deep questions brings great satisfaction to my life, and the excitement of the process is difficult to extinguish, even long after bedtime. None of this will come as a surprise to the hordes of my happy but anonymous compatriots. To the rest of you, I hope that my story will explain how physics and the pursuit of understanding bring meaning to life - and that it's so much darn fun.


About Me


I have always enjoyed science, and more broadly, reason. By shedding superstition and emotionally-based sloppy thinking, humans have made great progress in understanding the universe, from the fabric of spacetime, to clusters of galaxies, and everything in between, including life. Through an understanding of human evolution, we can appreciate our special place in the universe and marvel at the brain's ability to think. In short, the scientific method allows us to know what was thought unknowable only a few decades ago.


I get obsessed with my interests. I decided to deal with the stresses of being a young faculty member by setting up a fish tank with my daughter. In just a few months, this relaxing hobby turned our basement into a tropical fish breeding facility with a dozen large tanks. Sales of my fish to a local pet store accelerated my acquisition of equipment until the stresses of taking care of thousands of living creatures forced me to sell all of my tanks and take up astro-photography, a more relaxing hobby (or so I thought). I compulsively bought telescopes and equipment until I was forced to build a small room off of our garage to store everything, including a computer system on a desk with wheels.


At other times in my life, I read science fiction, amassing a nice collection of novels, whose colorful dust jackets decorate the shelves in our family room. Yearning for facts, I dropped my passion for fiction to read about history, religion, science, political philosophy, I.Q., etc. To satisfy my appetite for physical activity, I play ice hockey as often as possible.


Yet, I spend most of my time thinking about Physics. No other activity is more satisfying than understanding how things work or more spiritual than pondering the most basic questions. My career spans everything from highly applied research - such as co-developing a process for making polymer optical fibers (resulting in a start-up company) - to very esoteric basic physics, using both experimental and theoretical techniques to build an understanding of the universal properties of light-matter interactions.