Tuesday, March 27, 2012

After almost a year of waiting, finally some good news!

These are tough times for any programs that are funded by the U.S. government. Last summer I was contacted by NSF for some clarifications on my proposal, which I provided, and which resulted in the program manager's approval. In my experience, official notification to our university about an award comes no more than a few weeks after this first contact. In this case, we heard nothing for months.

While on an NSF panel in the fall semester, I visited with the program manager to inquire about the disposition of my proposal. He told me that the paperwork had been signed and that the documents "were on the Director's desk" awaiting final signatures. However, congressional battles over funding for NSF put proposals on hold until a resolution was in sight.

I was getting concerned that this proposal would never be funded. However, a couple weeks ago I was contacted by an accountant with questions about a budget issue. It took only a few emails to resolve the issue, so again, I waited. While I have not received official word in the form of a binding award letter, the NSF website now shows my proposal as "awarded." The reviews are also posted.

Excerpts from the panel summary follow.

Objective: The objective of this program is to invent new approaches for manipulating quantum systems in a way that enhances their nonlinear-optical response.

Intellectual Merit:
The intellectual merit of this proposal is very high and may lead to discovery of universal properties that will enable optimization of materials for a given task, such as nonlinear optical response. The PI's approach is to use sum rules in conjunction with numerical optimization and Monte Carlo studies to broadly understand those issues that are most important in making an optimized material. This will include development of fundamental quantum mechanical concepts that build an understanding of the performance limit of optical materials and practical methods for attaining the limit. The knowledge will guide chemists and nanotechnologists in designing new materials and nanostructures. The PI is very well qualified to carry out the proposed research. If successful, the proposed effort will lead to transformative paradigms enabling physicists and materials scientists to synthesize novel functional materials and to generate novel photonic devices.

Broader Impacts:
This proposal has the potential to make transformative advancements in the development of novel materials for photonic applications. The fundamental knowledge will serve as a guide in optimizing a material for a given task, such as nonlinear optical response. The work applies to any system based on the interaction between light and matter including molecules, inorganic materials, nanoparticles, smart materials, nanowires, etc. Education and outreach efforts are very strong and will broaden participation of Native American and under-represented groups and undergraduate students in the research program. The PI also plans to promote undergraduate participation and interaction with high school students through online interactive resources. Numerical codes that will be developed as part of the program work will be distributed over the web so that students can participate in research.

Summary:
The panel considers this an excellent proposal with strong technical and education components. If successful, the research will have a transformative impact on the development of fundamental quantum mechanical concepts to synthesize novel functional materials.
Reviewer Ratings: E, E, V

Panel Recommendation:
The proposal was placed in the Highly Recommended [HR] category by the panel. The Program Directors concur with the panel opinion as expressed in the panel summary with respect to both the Intellectual Merit and the Broader Impacts criteria.

Sunday, February 26, 2012

Extreme Physics


In our culture, the word "extreme" has taken on a new meaning because of its use in naming new sports that are dangerous. By "extreme physics," I mean the physics of a phenomenon when one of its defining parameters is at an extremum; that is, a minimum, a maximum, or a point of inflection. Mathematically, an extremum of a function is found at a point where the first derivative is zero. In many ways, extreme physics can be just as exciting as extreme sports.
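For the mathematically inclined, here is the textbook statement in shorthand (nothing beyond a first calculus course):

\[
f'(x_0) = 0, \qquad
\begin{cases}
f''(x_0) > 0 & \text{local minimum} \\
f''(x_0) < 0 & \text{local maximum} \\
f''(x_0) = 0 & \text{possibly a flat inflection point (higher derivatives decide)}
\end{cases}
\]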

It is interesting to me that the underpinnings of physics are based on extremes. Surely I am not unique in thinking this way; but, I am excited by the topic because one of my projects is based on the theme of using the limits of the nonlinear-optical response to discover new things about light-matter interactions that in the end may lead to a deeper understanding.

As we have been digging deeper and deeper, new patterns are emerging. This regularity, however, is only observed at the extremes of the nonlinear-optical response. Given how all of known physics manifests itself at extrema, it is pretty exciting to think that we may be on the verge of discovering truly new physics.

There are other projects that are going well and have potentially very exciting ramifications. For example, we are in the process of fine-tuning a new model of the self-healing process. This is not just a model in the form of an equation that we use to fit our data (which we are indeed doing); the parameters represent new phenomena. If the model fits, we are potentially looking at a new theory that may be generally applicable to many things. The more general the applicability of our work, the happier my mood.

I end here by lifting an excerpt from a review article we are writing for Physics Reports. It is a more detailed description of what I have written above; for readers who want the equations behind these statements, I collect the standard ones just after the excerpt.

The extremes of physics are characterized by unique behavior. For example, the second law of thermodynamics states that entropy cannot decrease in a closed system. The special case when entropy change is minimized (i.e. it remains unchanged) defines reversible thermodynamic processes. The maximum efficiency of a heat engine requires a reversible process. Calculations of reversible heat engine efficiencies led to the definition of entropy. While motivated by practical applications, entropy has become one of the most important fundamental concepts in physics.

Quantum mechanics is based on the fact that certain quantities cannot be simultaneously measured to arbitrary precision. To accommodate this observation, variables such as momentum and position are generalized to become operators that do not commute. The mathematical formalism naturally leads to the uncertainty principle, which states that there is a lower bound to the product of the position and momentum uncertainties. The fact that uncertainties are constrained by a lower bound is the basis of quantum mechanics, which describes a vast richness of new phenomena that is inexplicable using classical mechanics.

The principle of energy conservation originates from the more general concept of a Hamiltonian, which yields the equations of motion through a process of finding the extrema of the action. These ideas carry over into the quantum realm in the formulation of path integrals, which bring out the wave nature of matter. The absolute maximum speed limit defined by the speed of light, on the other hand, leads to non-absolute time, where observers in different coordinate systems view the same phenomena but from the perspective of a rotation in four-dimensional space-time. The marriage of relativity with quantum mechanics as embodied by the Dirac equation led to a natural way of accounting for the electron spin, and as a bonus unexpectedly predicted the existence of antimatter.

Clearly, the extremes are fertile soil from which the most fundamental concepts in physics grow. As we later show, the fact that there is a fundamental limit to the nonlinear-optical response of a quantum system defines an extreme that is characterized by several features. For example, while many states of a quantum system contribute to the nonlinear-optical response, at the upper bound only three states are found to contribute. This was originally postulated as a hypothesis and later confirmed to be true for many quantum systems, though it has not yet been rigorously proven. As such, it is referred to as the three-level ansatz.

We will show that systems with a nonlinear response near the fundamental limit share other properties. Why this is true is not yet understood; but, the fact that certain universal properties appear to be associated with the extremes of the nonlinear response hints at fundamental causes, perhaps grounded in new physics, which become apparent only under scaling rules that follow naturally from these limits.
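As promised, here are the standard textbook forms of the statements made in the excerpt, stated in my shorthand:

\[
dS \ge 0 \ \text{(closed system)}, \qquad dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h}
\]
\[
[\hat{x}, \hat{p}] = i\hbar \quad \Longrightarrow \quad \Delta x \, \Delta p \ge \frac{\hbar}{2}
\]
\[
\mathcal{S}[q] = \int L(q, \dot{q}, t)\, dt, \qquad \delta \mathcal{S} = 0
\]

The first line is the second law, with the reversible case that defines entropy and bounds heat-engine efficiency; the second is the uncertainty principle that follows from non-commuting operators; the third is the stationary-action principle (here \(\mathcal{S}\) is the action, not entropy) that yields the equations of motion.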

Sunday, February 12, 2012

A sampling of everything that is possible


It is spine tingling to view a two-dimensional representation of everything that is possible given the laws of physics. I had previously written about our work, where Shoresh used Monte Carlo simulations to let the computer randomly sample all possible quantum systems that are compatible with the Schrödinger equation. In my recent report to the National Science Foundation, my awe of Physics and its ability to make such grand and sweeping statements was rekindled. Above is a summary of our work in graphical form. It is aesthetically pleasing in its visual appeal, but more so in its intellectual content.
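For the curious, here is a toy sketch of the general Monte Carlo pattern; it is not our production sum-rule code, and every name, distribution, and formula in it is an illustrative stand-in:

import random

# Toy sketch: randomly sample candidate "systems," keep only those that
# satisfy a constraint (a stand-in for the quantum sum rules), and record a
# normalized figure of merit for each accepted sample.

def sample_system(rng):
    """Draw a random candidate: a toy transition energy and transition moment."""
    energy = rng.uniform(0.1, 10.0)   # arbitrary units
    moment = rng.uniform(0.0, 1.0)    # normalized so the maximum is 1
    return energy, moment

def satisfies_constraint(energy, moment):
    """Stand-in for a physical constraint such as a sum rule."""
    return moment ** 2 * energy <= 1.0

def figure_of_merit(energy, moment):
    """Toy response that grows with the moment and shrinks with the energy."""
    return moment ** 3 / energy ** 2

rng = random.Random(0)
kept = []
while len(kept) < 10000:
    e, x = sample_system(rng)
    if satisfies_constraint(e, x):
        kept.append(figure_of_merit(e, x))

print("samples kept:", len(kept))
print("largest figure of merit found:", max(kept))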

I cannot explain in the small amount of space here the beauty of these results, nor what they mean in terms of the basic science. Nature is clearly hinting at some deep structure of the quantum world when we probe molecules with light. My summary to NSF states it best:

"The uniqueness of our work is that it provides a new way of thinking, which in the words of a Spotlight on Optics article describes our approach as "reminiscent of the cutting of the Gordian knot by Alexander the Great. (see article)" Rather than focusing on one application, our work provides a new paradigm for understanding the critical material properties that need to be controlled for making better materials for any application that is based upon the interaction of light with matter.

"Technologies that may benefit from this work include high-contrast medical imaging, laser-based cancer therapies that target malignant cells without damaging surrounding cells, all-optical computers that are ultrafast by virtue of new architectures enabled by light, high-speed telecommunications through technologies that eliminate the electronics bottleneck, and high-density reconfigurable optical information storage."

Now I need to get back to the painfully boring task of updating my CV and preparing my annual review materials. I can't wait to finish these inane tasks so that I can get back to thinking about the things that I find most meaningful in life.

Wednesday, December 28, 2011

Annual Review of Faculty

I will once again be doing annual reviews of our faculty. Here I share my views on the process and describe the metrics I use to make evaluations.

There are three general categories: (1) teaching, (2) research, and (3) service to the university and the profession. The annual review is meant to give the faculty feedback about their performance for a particular year, and is one factor in determining raises. However, given budget cuts, there will be no raises in the foreseeable future.

While a faculty member may have a long and distinguished career, it is possible to have up and down years based on the simple fact that there are fluctuations in the metrics; one year a faculty member may have 10 excellent papers followed by a year with none. However, the annual reviews will not fluctuate as much as the metrics given that fluctuations are inevitable and usually not a sign of a problem.

Teaching is evaluated based on several pieces of information. The syllabus shows how the course is organized, and faculty usually provide a narrative on new ideas that they explored to enhance learning outcomes, as well as student feedback in the form of course evaluations.

I am strictly opposed to standardized teaching evaluation forms because they often lead to meaningless comparisons between professors in the form of numbers that each student assigns using very different criteria. I marvel at the well-reasoned forms that each of our professors produces and the unique information that they solicit. Since each faculty member may be using different approaches, a single number assigned by a student does not tell a meaningful story. Thus, I usually read every comment made by each student - even in classes with 250 students. It takes lots of time, but provides quality information with which I can get a better sense of how well our courses are working.

As a case in point, several years ago, I scanned through a bunch of evaluations of a young professor who was teaching a large section with hundreds of students and noticed two things. The mean was slightly below average (i.e. the typical student chose average or below average when comparing this professor to others at our university) and the distribution was bimodal, with a quarter of the students giving him the highest marks and three quarters giving him below-average ratings.

I gained great insights from the correlations between the numbers and the narratives. The typical student who gave a low ranking said the course was "unfair" because it was too hard. The students giving him higher marks stated that the course was challenging but the professor spent lots of time explaining complex concepts to the students during class as well as during office hours. In addition, this professor scheduled non-mandatory help sessions so that students could ask questions and get extra help.

A picture emerged of a dedicated teacher who expected excellence but was willing to expend a great deal of additional effort to make sure that the students learned. While his average course evaluation was lower than many of the other faculty in our department, I judged him to be a more dedicated teacher than those whose evaluations were high and the comments uniformly positive about the fact that the class was "fun" and "easy."

Research is more difficult to judge. The fact that a funding source is willing to pay top dollar to a given faculty member is a good indication that this individual and his/her work is considered useful or interesting. Similarly, papers that make it through the peer review process at good journals have convinced an editor and a few reviewers of correctness and importance. If a piece of work gets many citations, then it shows that people are reading the papers and finding them useful. Thus, a blend of these factors provides a good indicator of research productivity and quality.

However, these indicators can have the opposite meaning. For example, a paper may be cited many times as an example of bad science. Or, an individual with lots of citations may be a technician who provides specialized samples to many research groups. While such a person is contributing to the science by providing samples, the number of citations may not be a sign that the work is particularly interesting or creative.

Lots of funding is not always a sign of good science. A company may grant big bucks to a researcher to test a trivial property of a product. In contrast, a small grant from the National Science Foundation for work by a single PI who is challenging our perceptions of the nature of space-time would carry much more weight in my mind.

I try to consider all these factors together when evaluating the research productivity of a faculty member, which includes learning a little bit about their research. I may look at the h-index and/or the number of publications or citations per dollar spent, or other ratios, to develop an impression of the research quality. Faculty members who have attained Fellow status in a professional society get such honors from significant lifetime contributions to their fields, so fellowship in a society factors into the annual review. In the end, I make a value judgement that may or may not be in line with what others may think. I accept the fact that the process is far from perfect.
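For readers unfamiliar with the h-index, here is a quick sketch of how it is computed from a list of citation counts (the function name is mine):

def h_index(citations):
    """The largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers with these citation counts give an h-index of 3.
print(h_index([25, 8, 5, 3, 1]))  # -> 3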

Finally, there is service. Every faculty member is expected to contribute to the operations of the department and the university. Faculty members sit on tenure committees, thesis examination committees, search committees, and do all kinds of thankless tasks such as recruiting students, writing reports for the administration, and doing self studies that support our claims that we are a high quality department relative to our peers. I look for not only quantity of service, but the impact that it has on our department and university.

Service to the profession takes the form of membership on program committees for conferences, being on editorial boards of journals, reviewing papers for journals or proposals for funding agencies, organizing conferences, acting as an external examiner at other universities, and serving on panels that take advantage of an individual's scientific expertise. An active scientist does this kind of service as a matter of daily routine. I thus expect to see a substantial professional service component from each faculty member.

To make a final assessment, all of these factors are taken into account. A positive annual review requires significant accomplishments in at least two of the three major areas, with emphasis on research and education. What makes my job especially difficult is that our department is very strong. All faculty members are doing high-quality research, are well known in their fields, are well funded and are advising undergraduate/graduate students.

I have a few weeks break until I need to get around to this unpleasant task, which takes 5 full days to complete. For now, I am enjoying my break doing some new physics, which includes having completed an interesting calculation that was motivated by my proofing of one of Shoresh's manuscripts. Before the intensity of the new semester begins, I need to proof and submit a few more manuscripts for publication, as well as review a few more papers.

This past year has been a local maximum with a record 12 publications in refereed journals (almost 10% of my lifetime 123 publications), not to mention a bunch of invited and contributed talks. Now I will need to get back to work to maintain this momentum. But not until I take a day off to enjoy family, friends, and this wonderful place we call home.

Saturday, December 24, 2011

The amazing world of numbers

The other day, I was trying to get straight the words for the numbers in Italian. While our counting system maintains a strict pattern, the words representing them are irregular. The exceptions to the regularity got me thinking about how number systems developed from primitive times. As a good scientist, I will first propose my hypothesis without looking at the history books, though I admit having some knowledge on the topic.

Consider the necessity of counting to the shepherd while letting his flock out into the field and needing to know if any are missing at the end of the day. The shepherd added a stone to a pile for each sheep going out into the field and then removed a stone for each returning one. If a stone remained, the shepherd knew that a sheep had gone missing. In addition to being a practical counting technique, this procedure established the one-to-one correspondence between two sets of objects with the same number of elements. From a physicist's point of view, one could say that this also established the principle of conservation of stones and conservation of sheep. Their numbers did not change; they just moved about from one location to another.

To solve the issue of proliferating sheep and shortages of stones, the shepherd recognized that he could use a different type of stone to represent, let's say, 10 sheep. Thus, after the ninth stone was placed on the pile, the tenth sheep would be represented by the single 10-sheep stone while removing the previous nine. An extra stone would again represent each additional sheep until the 20th, at which point the next 10-sheep stone was used. Later, it became apparent that one need not use a special stone representing 10 sheep. Rather, one could place a stone in a different spot. Thus evolved the base 10 system, with separate symbols, or numerals, representing zero through nine, with these same numerals representing tens when placed in the tens spot, etc.
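The positional idea is easy to mechanize. Here is a little sketch (the function name is mine) that peels an integer into its digits in any base, with repeated division playing the role of the shepherd's piles:

def to_base(n, base):
    """Digits of a non-negative integer n in the given base,
    most significant digit first."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        n, remainder = divmod(n, base)
        digits.append(remainder)
    return digits[::-1]

print(to_base(2012, 10))  # [2, 0, 1, 2]
print(to_base(2012, 5))   # [3, 1, 0, 2, 2] -- the shepherd's one-hand base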

The "base" 10 number system most likely originated because of the ready availability of 10 fingers. I find it interesting that the word for finger and number shares the word "digit."

Other bases are also possible. When we were in Italy, I was at first puzzled at the system of Roman numerals engraved into the ancient buildings, which differ from what we commonly refer to as Roman numerals today. For example, the modern form IV was represented as IIII, and what we would recognize as IX was written as VIIII. This even carried over to larger numbers, such as CD, which they wrote as CCCC. Clearly, the original form of the Roman numeral system seems to suggest base 5, as one would expect from counting on the fingers of one hand.
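A little sketch of that purely additive style of writing (the function name is mine; modern subtractive forms like IV would need extra rules):

def roman_additive(n):
    """Old-style, purely additive Roman numerals: 4 -> IIII, 9 -> VIIII."""
    values = [(1000, "M"), (500, "D"), (100, "C"), (50, "L"),
              (10, "X"), (5, "V"), (1, "I")]
    out = []
    for value, symbol in values:
        count, n = divmod(n, value)
        out.append(symbol * count)
    return "".join(out)

print(roman_additive(4))     # IIII
print(roman_additive(9))     # VIIII
print(roman_additive(400))   # CCCC
print(roman_additive(1799))  # MDCCLXXXXVIIII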

Now back to Italian. The pattern from zero to nine is as one would expect for base ten, with unique names corresponding to each numeral. Above dieci (ten), the system becomes schizophrenic. Undici (for eleven) seems to be saying one and ten, but quindici (for fifteen) is irregular in the sense that it is not a compound form of dieci and cinque (five). After sedici (sixteen), the pattern reverses to diciassette, diciotto, diciannove, etc. Interestingly, the naming pattern for 20 and above continues along a strict convention without exception. Happily, 50 is cinquanta, not quinquanta as I would have expected given the expression for 15. This pattern suggests an original base 16, or perhaps 15 with origins in the Roman base 5 system, which later got fixed to be consistent with base 10 convention. Whatever the case, the words hint at a mixture of systems.

Words and grammar can carry secret messages from the past. Ideas that were once embedded in people's minds crept into language and became firmly rooted once it was formalized into written form. Thus, what remains today in any language provides a snapshot of common usage from the past, which reflects the understanding of those times.

I was excited the morning that these ideas raced through my mind. I then thought about numbers in different languages. The words for the numbers in English are distinct until thirteen, which takes the form of three and ten, etc. Could this show the remnant of a base 12? This seemed plausible, given the fact that some units of time (12 hours representing half a day and 12 months making a year) seem to have a preference for 12. And don't forget 12 inches to a foot.

I then rattled off the numbers in Ukrainian. It was purely base ten. I had taken French in high school and some in college, but my French got totally erased when I started to learn Italian (except for the time that French got mixed in with my Italian when I said to a French colleague while in Italy "qualchechosa," a hybrid of "qualcosa" (Italian) and "quelque chose" (French)). So, I asked my wife to remind me of how to count to twenty, and the pattern turns out to be the same as in Italian!

Stones may correspond to sheep, but how does one deal with fluids? This is an important quantity when bartering in liquids (as in ordering a beer). At some point, humans must have recognized that liquids are conserved. In other words, they can be moved around and split into smaller amounts, but when recombined, the quantity fills the original container in the same amount.

In the case of liquids and weights, there appears to be a preference for base 2, which makes sense given the ease with which we can split liquids in half over and over again. There are 2 cups to a pint, 2 pints to a quart, 2 quarts to a half gallon, and two half gallons to the gallon. Also, there are 8 ounces to a cup and a pint is 16 ounces (base 16!). And don't forget 16 ounces to a pound. The relationship between weight (or more precisely mass) and volume is clear. 16 ounces of water or beer weighs 16 ounces, or one pint weighs a pound. No wonder!

The birth of the decimal Metric System (base ten) coincided with the French Revolution, when two platinum standards representing the meter and the kilogram were deposited in the Archives de la République in Paris, on 22 June 1799. This was the first step in the development of the present International System of Units, in which the basic units of distance, mass, time and current are the meter, kilogram, second and ampere, respectively.

For convenience, modern day computers use binary, or base two: ones and zeros are easily representable with the gate of a transistor being on (1) or off (0). For programming convenience, the bits (binary digits) are combined into groups of 4, leading to a hexadecimal representation (base 16), where two hexadecimal "digits" can represent the numbers 0 through 255. As such, pairs of hexadecimal numerals are used to describe the alphabet (upper and lower case), the ten base 10 numerals, as well as a bunch of special symbols.
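Here is a tiny illustration of the bookkeeping (Python's built-in formatting does the work; the characters chosen are arbitrary):

# One byte (8 bits) holds the numbers 0 through 255;
# two hexadecimal digits say exactly the same thing.
for ch in "Az9":
    code = ord(ch)  # the character's numeric code
    print(ch, format(code, "08b"), format(code, "02X"))
# A 01000001 41
# z 01111010 7A
# 9 00111001 39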

The simple act of counting eventually led to the transformation of primitive society into one that can understand the mysteries of nature, which since antiquity had been thought incomprehensible in the absence of deities. Our language, on the other hand, provides clues of the thought processes that went into the development of counting, which forms the basis of mathematics, physics, the sciences, engineering and technology - historically following approximately that order.

After writing this post, I searched Wikipedia for additional information on various bases used by various civilizations in various eras. In the third millennium BCE, the Sumerians used a base 60 numeral system, called the sexagesimal system. The symbols used are shown to the right. Incidentally, our system of 60 seconds to the minute and 60 minutes to the hour is -- you got it, base 60! But so are angular measurements. There are 60 arc minutes in a degree, and sixty arc seconds in an arc minute. This connection makes sense given that the timing of the sun's apparent motion in the sky is measured as a change in angle over a period of time.

It is obvious how the base 5, 10, and 20 systems follow from counting with fingers and toes. Thus while the sexagesimal system is base 60, the symbols follow a base 10 pattern. However, an advantage of base 60 is that 60 has a large number of factors (1,2,3,4,5,... you can determine the rest) so that fractions are easier to represent.
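A quick illustration of both points, time in base 60 and the abundance of divisors (the numbers chosen are arbitrary):

# 3725 seconds, peeled apart in base 60 with two divisions:
minutes, seconds = divmod(3725, 60)   # 62 minutes, 5 seconds
hours, minutes = divmod(minutes, 60)  # 1 hour, 2 minutes
print(hours, minutes, seconds)        # 1 2 5

# Part of base 60's charm: it has a dozen divisors.
print([d for d in range(1, 61) if 60 % d == 0])
# [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]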

According to the Wikipedia article on base 10, peoples using other bases are as follows:

  • Pre-Columbian Mesoamerican cultures such as the Maya used a base-20 system (using all twenty fingers and toes).
  • The Babylonians used a combination of decimal with base 60.
  • Many or all of the Chumashan languages originally used a base-4 counting system, in which the names for numbers were structured according to multiples of 4 and 16.
  • Many languages use quinary number systems, including Gumatj, Nunggubuyu, Kuurn Kopan Noot and Saraveca. Of these, Gumatj is the only true 5–25 language known, in which 25 is the higher group of 5.
  • Some Nigerians use base-12 systems.
  • The Huli language of Papua New Guinea is reported to have base-15 numbers. Ngui means 15, ngui ki means 15×2 = 30, and ngui ngui means 15×15 = 225.
  • Umbu-Ungu, also known as Kakoli, is reported to have base-24 numbers. Tokapu means 24, tokapu talu means 24×2 = 48, and tokapu tokapu means 24×24 = 576.
  • Ngiti is reported to have a base-32 number system with base-4 cycles.

So, what began as an innocent language lesson on the numbers led to a train of thought that occupied my mind for a morning and gave me the satisfaction of understanding something that was new to me. The fact that I had not learned something new to the world did not bother me a bit.

With all this numerology and talk of ancient number systems, am I worried about 2012? What do you think?

I am finishing this post after our traditional zillion-course Ukrainian Christmas dinner, followed by a glass of eggnog and countless chocolate-covered pretzels, so errors in my logic are undoubtedly plentiful. No apologies!

I wish all of you the best that the holiday season has to offer. Given the international nature of my handful of regular readers, I would be interested in hearing about how you form the words for the numbers in your native languages, and at what point if any, they are irregular. In the meantime, I will be sipping on another eggnog. Good night!

Wednesday, December 21, 2011

A correspondence on the intrinsic hyperpolarizability

I often get correspondence from scientists from all over the world. One such message arrived a couple of days ago asking about the intrinsic hyperpolarizability and why it is a useful quantity for comparing molecules. Below is the original message and my response:

Email to me:

Dear Sir,

I have a doubt regarding beta-intrinsic value. Which molecule is of greater practical importance, having a greater beta-intrisic value or a greater beta-value? If molecule has greater beta-intrinsic and lesser beta-value as compared to its related counterpart can it be regarded as a better molecule for practical applicability?
Thanks.

Kind regards,
Sincerely,
So-and-so

My response:

Dear Dr. So-and-so,

The intrinsic hyperpolarizability is used to understand the origin of the nonlinear response of a molecule. Making a molecule larger will yield a bigger value of beta; but, the intrinsic hyperpolarizability tells you how effective it is given its size. This kind of understanding can lead to the rational design of better molecules by first identifying ones that have a large intrinsic hyperpolarizability and then making them larger using the same "shape" or theme.

Having a molecule with a large hyperpolarizability is in itself not technologically significant because that property alone will not necessarily make it useful in a device. It needs to be incorporated into a material with a large bulk response and then needs to be formed into a device component that is photochemically stable, etc. Thus, a molecule with large beta is not of technological interest without lots of other work to determine other properties; and, a small intrinsic beta makes it less interesting from the point of view of science.

A large beta molecule may be of interest to others if it has other unique properties, such as an ability to attach it to microdots to enhance local electric fields, or if it acts as a charge sensitizer in a polymer, etc.


Best,
Mark G. Kuzyk
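For the non-expert reading along: stated roughly (see the published papers for the exact numerical prefactor), the intrinsic hyperpolarizability divides the measured beta by the fundamental limit set by the number of electrons N and the energy E10 of the first excited state,

\[
\beta_{\mathrm{int}} = \frac{\beta}{\beta_{\max}}, \qquad
\beta_{\max} \propto \left(\frac{e\hbar}{\sqrt{m}}\right)^{3} \frac{N^{3/2}}{E_{10}^{7/2}},
\]

so a small molecule with a modest beta can sit closer to its own limit than a giant molecule with a huge beta. That is the sense in which it measures how effective a molecule is for its size.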

In the near future, I plan to write a description of our research aimed at the non-expert so (s)he can gain an appreciation of our work, which is based on trying to understand complex properties of a system by looking at large-scale patterns. Stay tuned.

Tuesday, December 13, 2011

A video game from the past.

When I was in high school (circa 1975), I built a computer from a kit. I spent days and nights soldering together components and inserting chips. The final box was similar in size to present-day workstations (click here for a photo and specs). That is where the similarity ends.

In those archaic computers, programs were written in machine language, were entered in binary with toggle switches on the front panel, and the results were displayed on a couple dozen light-emitting diodes. I was excited just being able to write a program that made the row of lights simulate motion.

My dream was to someday acquire a teletype terminal so that I could type in programs and be able to print them out rather than seeing one line of code at a time in glowing red dots. Worse yet, there was no external storage, so once the computer was turned off, the programs disappeared forever. I kept well-organized handwritten sheets of paper in a binder with code so that I could quickly re-enter it with the toggle switches in the future.

A few years later, I was ecstatic that computers for the home were available for the low price of a thousand bucks. They still had no hard drive for storage (I would save up for that later), but they came with a built-in keyboard and a plug for a TV set that acted as the display device. I paid extra to upgrade my RAM from 16 to 48kB - today I have almost a million times that amount of memory. Click here for detailed specs of these computers.

When I first got my Atari 800, I was up for 2 days straight writing code under huge protests from my parents. I also had to plead with my frugal parents not to turn off the computer so that my programs would not disappear.

It took me about an hour to duplicate a simple version of that classic computer game called Pong, with an on-screen paddle that was controlled with a joystick and a ball that bounced around the screen making a sound each time it was deflected. I can still see my mother playing the game, shouting out curses when she missed that little ball. The Atari 800 was great with sound and video, so it was ideal for video games.

After I got my floppy drive (I think it had a few hundred kilobytes of space - my drives now have a million times more) I decided to write a space-battle type video game, which I originally called Star Trap but later had to change to Stun Trap because of trademark issues (another company had already trademarked that name). Before writing the game, I got documentation on how to call special routines that would activate pixels on the screen for animation. I also got an assembler so that I could write my code in the ultrafast machine language of the MOS Technology 6502B chip.

I spent a bunch of time coding and eventually started a company named Affine Inc. to manufacture and market Stun Trap. After incorporating, I got family members and friends to invest real cash. With these funds, we produced the game, including artwork for the cover of the box (right) and our advertisements that appeared in computer magazines. At that time, we also opened a computer store and ran a mail-order computer business to generate more cash flow.

The final touch on the software end was my own invention for software protection. The game would only run if our disk was in the drive. We made a precise plastic fixture that would hold the disk in place so that we could puncture it in a specific place (we used a common push pin). The program would then write to the location of the hole and check for a write error. The error indicated it was an official version.

Back then, to get a good unit price on the packaging, we had to order several thousand boxes at once. These filled our tiny condo, as did our inventory until we moved into our computer store, with a backroom for inventory and a small sales desk for the mail order business.

Things moved quickly. After our ad appeared, we were picked up by a couple of big chains in New York City who bought enough games to stock each store. The games sold for a retail price of about $25 to $30, and it cost us about $12 to manufacture each one.

The big break came when K-Mart was considering buying 50,000 games. The downside was that they expected terms of net 30, which meant that they had 30 days from delivery to pay, so that we would need to borrow $600,000 to supply the product. And, they had the option to return the games if they didn't sell. Luckily, the Atari game market flopped before we decided how to deal with such a potentially large sale and its associated risks. Our game sales dried up to zero overnight, and in the end, I believe we sold less than a few hundred units.

The computer store and mail order business continued operations for quite some time, but I had had enough. My one-year hiatus from grad school was getting me itchy once again to do physics, which I did with a vengeance upon my return. The rest is history.

A few days ago, I chanced upon a website that had a screenshot of my game (shown here) as well as the code to run it on an Xbox (someone had taken the time to transcribe the code). Given the small number of units that were sold, I am curious where they found a copy.

It was fun writing the game and producing a product that lives on via the internet. Hopefully, my present activities will be more fruitful to the citizens that support my passions than this game, which was wonderful when viewed through my eyes, but in the end not a very popular product.

I would be interested in hearing from anyone if they are able to run the game on their Xbox. http://www.atarimania.de/detail_soft.php?MENU=8&VERSION_ID=5117