An Oral and Pictorial History
of Large Scale Scientific Computing
As It Occurred at the
Lawrence Livermore National Laboratory

The Incomparable George Michael

Introduction

by George A. Michael, the curator of this project






I retired from the Lawrence Livermore National Laboratory (LLNL) at the end of 1991, having started there on April 16, 1953. Since 1995, I have been working, part time, on a project aimed at recording the memories of some of those who were involved with the beginnings of computing at the Lab, and at saving materials that might be worthy of a museum setting. One reason for embarking on the project is that hardly any of this original work was published in technical journals; the excuses usually involved security considerations of one sort or another. In this introduction, I try to provide a sense of what it was like in the early days, as a backdrop for some of the things that were done. I do not intend these accounts to be considered complete or definitive. They are the personal perceptions of the interviewees. It is likely that I missed some people who should have been interviewed. It is also likely that errors of interpretation have crept in. I accept the blame for that, and note that a way has been provided for adding new interviews, making corrections, and incorporating other changes.

I think it is also important to note specifically that this project is not sanctioned or sponsored by the Laboratory. "Tolerated" might be more accurate. In any event, if it's necessary to assign blame, the Lab is innocent. I hope I have preserved the technical details and also showcased the spontaneity and creativity of those who were doing the work. Clearly, it will be the judgment of the readers as to how successful I was.

Before continuing with this introduction, I want to acknowledge all the persons whose assistance was absolutely vital in producing this collection of memories.

Acknowledgments and Disclaimers



This project has been more difficult than expected, for reasons that seem clear now but weren't initially. I started the audio recording of the memories of some of the early computational activities with the intention of providing the tapes as grist for some real historical analysis by some real historians. Initially, I had some Lab support, through the courtesy of Chris Hendrickson, my immediate supervisor, who thought that the interviews should be typed and published as Laboratory reports. During these first efforts, my colleagues-in-crime were Catherine Williams, who did the typing and served as an editor, and Jerry Owens, who further proposed that the interviews be made available at a "Virtual Museum" Internet site. This is the plan we have followed. But after Chris retired, the funding quickly dried up and, for the last several years, the entire project has had to rely on volunteer help alone. It was in this period that I finally began to understand the difference between what I thought Computation should be and what Laboratory policy actually was. I'll return to this point later.

These volunteers, friends and colleagues, are vital to the completion of the project. I started with the excellent assistance of Catherine Williams, who not only produced the first transcriptions but also edited the first dozen or so interviews. A medical condition forced her to retire, and we thereby lost her services. Some time later, SharonLee Danielson, a former secretary and administrative assistant who has since retired, volunteered to take on the chore of typing. Additionally, she has organized the files and other records so that, thanks to her, we can tell at a glance how things are progressing.

Sam Coleman, also a retiree, designed the file organization and is converting the edited typescripts into documents capable of being viewed over the Internet, including pictures and other reference material. Tom DeBoni, now at LBNL/NERSC and formerly at Livermore, has managed the chore of actually posting the material at a web site. Once the interviews are posted, there is a more or less continuing edit phase in which changes and corrections of all sorts are considered. James (JED) Donnelley, also formerly at Livermore and now at NERSC, has led this effort. We are grateful to Horst Simon, the director of NERSC, for hosting our collection of interviews. The Laboratory archivists, Beverly Bull and Steve Wofford, have been indispensable informal sources for some of the hard-to-find items that relate to this project, and also inspirational supporters of this effort.

I have one additional remark: my memory is not so good that everyone who should have been remembered was. The same is true for the reported events and chronology. All dates and sequences are subject to corrections that may appear from time to time. For all mistakes, the procedure I recommend is, "Don't blame Michael." You could be kind and forgive him, but much better would be to correct him.

The Structure of this Oral History



This project contains interviews of some of those who were involved with the formation and development of the Computation Department at the Lawrence Livermore National Laboratory during the years 1950 to 1975. For the sake of continuity, some events go beyond 1975.

Many consider these the golden years of computing, and certainly of computing as it evolved at the Laboratory. Generally, there were hardly any rules or other useless administrative stuff. People were free to work with a minimum of disruptions.

During this time, outsiders throughout the world recognized computer usage at the Laboratory as innovative and pioneering in applying large computers to a huge range of problems. The emphasis to note is on largeness. In the early days, when software was bundled and freely distributed, the programs we got were often inadequate; they simply didn't fit well with the large programs we were developing. There were limits on array sizes and tables, even on program sizes. Such programs lacked a certain toughness and flexibility, or imposed limits of one sort or another that made them simply unusable. So, of necessity, we came to depend on software developed at the Laboratory. Our programs weren't necessarily better than the versions we could get from outside sources, but they were more robust, and they allowed the users much more flexibility in developing their applications programs.

Notwithstanding, we enjoyed and benefited from a great camaraderie with colleagues throughout the country, and especially with those in the Bay Area.

Let me note in passing that the attitude within Lab management was that the only thing of importance was Nuclear Research (read Weapons Design). Everything about computing, no matter how important it was to computing itself, was hardly worth recognizing in its own right. Nothing was allowed to detract from the urgency of nuclear design, at least as perceived by the AEC headquarters in Washington. This singleness of purpose produced the finest weapons research laboratory in the world, even as certain strains arose between the weapons designers and the creators of the tools they used. For many caught up in the orgy of computational discovery, this didn't matter. Weapon design was, after all, the reason we were hired. We weren't hired to specialize in creative methods for using computers, even though such specialization may have been necessary. Notwithstanding, it was fun doing these things. If the managers chose to regard the computers as nothing more than fast slide rules, that was their problem. We were having fun.

In any event, we seemed to be given enough money to get things done, and the challenges, such as they were, didn't daunt the staff. Being inexperienced, many never noticed how hard some things were to get done, except that most projects were perennially late. Many cynics (realists) argued that the Lab was a spoiled rich brat: "Anybody else with that much money would have done as well, or better." As far as I'm concerned, the right response to this is, "While you may be correct, history doesn't record its alternatives."

I can say with a great amount of certainty that we had three, possibly four, crucial stimuli. From observation, most other research centers lack one or more of these. We had the money, we had good people, we had real applications and, finally, we had a management smart enough to not interfere as long as the programmatic work was getting done. Moreover, most of the work was hardly impeded by pauses for writing and publishing reports.

In fact, that was one of the major problems. Not much was reported or published. Since publishing did little for nuclear design, it wasn't important; no brownie points were available for publishing or for such things as teaching classes about programming.

I don't wish to imply that everything was perfect, and I certainly don't wish to suggest that everything written was good stuff. A great deal of the output of computation programmers was comfortably pedestrian. As valuable as it was for nuclear design, I have not tried to capture any of those results in these memoirs. First and foremost, I want the interviews to convey some of the ambience of those days, and highlight some really neat things characterized by good vision, clever implementation, or things that really served user needs.

Finally, I don't intend to imply that we always enjoyed unanimity amongst the staff. There was a lot of arguing. There were big egos and bigger tempers. There were big arguments about direction, false starts, missed goals, and late delivery dates. The path was hardly ever smooth or well marked, nor was consensus the usual situation. But when things worked out, there was an unmistakable sense of accomplishment and pleasure.

For the group doing computation, two other things contributed to the stimulating atmosphere of cooperation and free inquiry. First, there was the excitement of discovering new things, or better ways of doing old things; it is well known that there is great pleasure in doing things that others say can't be done. The second factor, especially for the more junior staff members, was meeting and interacting with senior scientists who routinely visited the Laboratory. Generally, we knew of such persons from reading about their work while we were students; persons such as George Gamow and John von Neumann. For such reasons, working at the Lab was the best intellectual high many of us ever experienced. It was fun!

As far as I'm concerned, this oral history is not a history. Rather it is a collection of war stories, some uncoordinated sequences of events as seen by individuals. I have tried to ensure that names and dates are correct and appear in the correct context throughout the interviews, but otherwise the stories belong to those being interviewed. Eventually, I will try to provide a "cast of characters". I wouldn't be surprised if a careful reader uncovers some inconsistencies between narratives. No big effort was made to catch such things. I will, of course, be grateful for any alternate versions readers may wish to provide. Within reason, these will be fully published, with attributions.

Before continuing, and just this once, I would like to mention some gratuitous thoughts about the Lab's administration, because I still feel stymied by what seems to be management blindness both at the Lab and at DOE Headquarters. My comments might also be interpreted as my attempt to understand what went on and, specifically, no one else should be blamed for them.

I've already alluded to the management bias toward nuclear design. The reader must not consider this wrong. The lab was founded to do nuclear design. Practically all early senior managers thought of computers as slide rules or adding machines. It was mostly through use that alternate visions of computational potential became obvious.

At the beginning, and time after time thereafter, I felt many key decisions made about the Computation Department's activities were plainly wrong, or at least lacked a vision of the potential benefits. Even more to the point, while many of the computational achievements truly helped weapons design and research, they were never suitably acknowledged. For example, there is no doubt that Time Sharing was of singular importance in exploring the nooks and crannies of nuclear design, but the designers, after initially fighting it, never acknowledged its value except, of course, by using it.

It is generally true that the Lawrence Radiation Laboratory was founded largely because of the activities of Edward Teller and, without question, the Lab was not intended also to be a research facility for computation. Indeed, the oft-stated goal was to advance thermonuclear weapons research, deemed absolutely necessary because the Russians had produced their first nuclear device some five years earlier than our pundits had predicted. So, as has been reported elsewhere, Dr. Teller was unhappy with the lack of progress at Los Alamos in developing the so-called Super. Part of the problem was that no one knew how to design such a device, and some at the AEC felt it shouldn't be tried at the expense of not improving the designs of ordinary fission weapons. Finally, a promising way was proposed by Teller, based on a seminal suggestion of S. Ulam, a mathematician at Los Alamos. Even then, Teller felt that progress was too slow. A second laboratory was deemed necessary, and a new contract between the Atomic Energy Commission and the University of California was signed that led to the formation of what is now called the Lawrence Livermore National Laboratory (LLNL).

Among the many consulting scientists who advocated computing support as a necessary part of weapons design was John von Neumann, who aided the original staff in acquiring its first computer, the UNIVAC 1, in 1952. But note well: it was computer support, not research. The general attitude was to tolerate some computer research because it was needed for nuclear design support. Exploring this new computational space was supposed to go on at other places.

So, computing was a tool for the senior physicists and definitely not, in any scholarly way, a basis for a new kind of scientific development. From the beginning, computer research was tolerated, and indulged only to the extent that it advanced the major programmatic goals of the weapons projects. Senior scientists at the Lab were allowed six hours a week to pursue their own research interests. This benefit was not extended to the programmers. Initially, programming, or coding, as it was then known, was not even considered a profession. It was mostly through the efforts of Sid Fernbach, the first director of computation, that the U.S. Department of Labor was prevailed upon to recognize a new professional classification. To be paid as professionals, computer programmers were no longer just coders or computists; they were reclassified as "Mathematical Programmers / Computer Scientists."

The problem was that much more than mathematical algorithms was needed to exploit computers usefully, including the development of discrete mathematics, operating systems, program libraries, storage devices, output procedures including visualization, performance analysis, and so on. Much of this either didn't exist at all, or what was available from other places was inadequate. The de facto solution was to allow the staff to produce what was needed. The new problem that then arose was how to stop the staff from continuing with this fun research after they had satisfied the designers' immediate needs.

In any event, the Computation Department started to live a life of its own because it was producing other things needed by the designers, even when the designers didn't know such things were useful. For example, the physicists initially wrote their "design codes" in octal because there wasn't any alternative, and the initial programs were very simple. As the design programs grew in size and complexity, other methods were needed. Even so, when assemblers were made available, the designers were slow to use them. Similarly, when compilers arrived on the scene, the designers were again slow to use them. The same was true for timesharing, mass storage, graphics, and so on; you get the idea. Computational progress was rapid, but adoption was slow. The designers were not unreasonable. Their job was to do nuclear design, not "waste a lot of time" improving the tools, especially when most of those tools weren't user-friendly. This is understandable and, initially, the weapons divisions could justify their results, making it easy to ignore what some of the programmers were doing.

As the Laboratory aged, the willingness to allow the computists such free rein began to dissipate. Even though the needs did not go away, the weapons divisions more and more advocated purchasing the things they needed rather than waiting a long time for the Computation staff to produce them because, they said, purchasing was cheaper. It may indeed have been cheaper, but the purchased products didn't necessarily always work properly, so really talented people were turned into babysitters for other people's work. Not good.

In many instances, the weapons divisions were right, but one consequence was that innovation began to disappear from computation. Today it is largely gone, even though the budgets have not shrunk. Even during the golden years, the weapons divisions, controlling as they did at least 85% of the computation budget, discouraged the very developments that made the early Laboratory a computational showplace throughout the world. Another, perhaps unintended, consequence was that not much scholarly publishing was done. So, few outside the Laboratory knew about the development of the Television Monitor Display system, the timesharing systems, the File Storage and Management systems, and many other things that are discussed in these memoirs. Face it, Livermore was primarily devoted to excellent nuclear weapons design; computational developments had to take a back seat.

It makes no sense for this project to try, now, to do such publishing. However, the events, as seen by those who were there, form a collection of authentic war stories worthy of being aired. These are what I set out to collect, from persons who functioned in one or more of the following categories:

  1. General Computing Systems (Software and Hardware);
  2. Applications Programming;
  3. General Administrative activities;
  4. I/O, especially Networking, Storage and Graphics.


As you will see, a list of computer acquisitions makes a good skeleton on which other things can be hung. Here is a short chronology, accurate to the best of my knowledge, of large computer acquisitions between 1950 and 1978 that should help place the interviews in a useful context. Small computer and peripheral procurements are briefly considered later in this introduction. It's a good thing to keep in mind that all the computers listed below were selected largely because it was claimed they were the fastest things then available; that's what the designers kept demanding: faster and faster computers. In the listing that follows, I've interspersed some gratuitous comments. This makes the introduction overly long, but may highlight the reasons for choosing these particular computers.



The UNIVAC was a serial computer using decimal numbers. It was more than a computer; it was a complete system that addressed the needs of input and output, storage, and arithmetic. It had a 1,000-word memory, and each word held twelve decimal digits. Being a serial computer meant the UNIVAC performed all its internal operations, including arithmetic, one digit at a time. It had built-in circuitry to check, periodically, the entire contents of its memory and all its registers for errors. If errors were detected, the machine halted so that corrections could be made.



The IBM 701 was a parallel computer using the binary number system. Except for the UNIVAC and the LARC, all other computers at the Lab used the binary number system. The 701 had a 4,096-word memory, each word being 36 bits, equivalent to almost 11 decimal digits. The parallel feature means the machine did most of its internal operations, including arithmetic, on whole words. This helped the machine to be much faster than the UNIVAC. The 701 used Williams CRT tubes as its memory, which made the machine very unreliable: it freely made mistakes, but never reported them. This was fundamentally different from the UNIVAC. All the later computers in this list used a parallel organization, and all benefited from improved components, so errors became less of an issue as time went on.



The main virtue of the 704 was a larger and improved memory based on ferrite cores, and hardware support for floating point arithmetic. It was faster than the 701 and vastly more reliable. The 704 introduced the use of Index Registers to facilitate addressing. One of the major accessories was a CRT system that allowed direct viewing and/or recording of data on 35mm filmstrips.



The 709 was quite similar to the 704, but with a much improved method for doing input/output operations. It also accommodated larger memories. A CRT system similar to that on the 704 was tried, but it was both too cumbersome and too slow. All subsequent graphics work was carried out on specially designed units that could, if desired, be attached to a mainframe.



The 7090 was IBM's first 7000-series machine to use transistors, which allowed a large increase in speed. The machine supported a memory of 32,768 words (32K). As with all of these IBM computers, each word was 36 bits in length.



The LARC (Livermore Automatic Research Calculator) was a decimal machine and the first designed to use transistors, although it was not the first transistorized computer delivered to the Lab. More importantly, it introduced at least two architectural innovations that have persisted in other large computers: the separation of input/output controls from the main CPU, and the introduction of Register Files for rapid context switching. The big speed increase resulting from the use of transistors, while real, was overshadowed by the earlier arrival of the 7090, which provided most of the expected increase in speed. The LARC also introduced an Electronic Page Recorder (EPR) that permitted recording text and graphs onto 35mm film, but the attempt to process the film automatically was not successful. The LARC, although very late, met or exceeded all its design goals. It ran practically error-free, and it was the last computer at the Lab to be based on the decimal number system.



The 7094 was the culmination of the IBM 7000 series. It used the memories designed originally for the IBM 7030, and was almost twice as fast as the 7090. The similarity of the architectures in the 7000 series made migration through successive models relatively easy.



The 7030 was IBM's response to the LARC project. It was, perhaps, overly elaborate and, like the LARC, very late in delivery. Again, the expected speed increases were overshadowed by the earlier introductions of transistorized computers: the 7090, LARC, 7094, and so on. The 7030 provided the largest memory of any machine then in use at the Laboratory. It used 72-bit words, of which the last 8 bits were for ECC (Error Correction and Control).



The 1604 was one of the first computers designed by Seymour Cray. It used a 48-bit word, and it was acquired to study an architecture different from the IBM designs then in use at the Laboratory and to explore variations in numerical precision. Additionally, many programmers considered the I/O procedures of the CDC computers easier to use.



The 3600 was also acquired as part of a plan to make an orderly transition to the CDC 6600. Memory size on these CDC machines was 32K words. The CDC computers used 48-bit words while the IBM 7000 series used 36-bit words, which obviously allowed more numerical precision. Studies had now begun to determine how much precision was actually needed in the big design programs. The batch operating system used on these machines was considered easier to use than that on the IBM machines.



The 6600 was considered the first Supercomputer. It was the fastest computer in existence and provided the largest memory. It fully exploited the flexibility of separate computation and input/output processors: a separate set of ten peripheral processing units (PPUs) managed all I/O operations. Its arithmetic provided the largest numerical precision. The memory contained 128,000 sixty-bit words operating at a cycle time of one microsecond, about the fastest memory then available. Actually, at this speed there were some memory errors, but the machine was sufficiently fast that one could introduce suitable strategies to catch the errors, rerun all or part of the calculation, and still end up running faster on this machine than on any other [1].



Seymour produced an interim computer called the 6800, but it never satisfied his goals, so it never left his computer lab. He then designed and delivered the 7600, his third Supercomputer design. He preserved some of the architecture of the 6600 and introduced some radical new things. The I/O control was reintegrated into the CPU and, more radically, the memory was divided into two parts. The faster part, the Small Core Memory (SCM), consisted of 65,000 words of 60 bits each. The larger part, the Large Core Memory (LCM), held 500,000 words that were accessed as superwords (SWORDs) of 480 bits, each of which could then be treated as eight 60-bit words. The LCM was slower than the SCM, but its superword access moved eight words at a time. Granted that memory technology was somewhat primitive, this memory bifurcation was an idea used to control costs, but programmers found it difficult to manage. Even so, the 7600 was considered about five times as fast as its predecessor, the 6600. (The first 7600, serial number 1, remained in production for twenty years. It was retired in 1989 despite screams from the users who loved it. Thus ended an era not likely to be repeated at LLNL.)



The STAR 100 was the first vector machine to be designed but, because of its other complexities, it was not the first to enter the marketplace. Vector computing was then rather new, and adapting programs to it was difficult. A major problem on the STAR was the disparity in speed between vector and scalar computations. The STAR provided the largest memories in use at the Lab: one or two million words. The STARs were used as time-sharing servers, but that was very hard to do and efficiency suffered. They were late in delivery, it was difficult to produce efficient programs for them, and they generally failed to make an impression on the marketplace. One of the designers at the Lab remarked that his team had spent seven years gaining mastery of the STAR, and that was seven years of not improving the physics; he was never going to do that again.



The Laboratory did not get serial 1 of the CRAY 1, and it was at least two years after the STAR adventures before a new computer was allocated to the Lab. The word size for the CRAY was 64 bits, and up to 2 million words of memory were allowed. As with all Cray designs, there really wasn't any upward compatibility in the order codes of the machines, but all of them rewarded careful programming with increased speed. Upward compatibility was not a goal of Seymour Cray's designs; speed was [2]. It is worth noting that, until the Cray 1 era and even for some time thereafter, the IEEE standard for computer arithmetic had not achieved general acceptance, and therefore each conversion from one machine to another had to take differences in arithmetic into account.


[Charts (not reproduced here): "Specs of Livermore's First 36 Major Computer Systems," plotting Clock Rate (Hz), Cycle Times (nanoseconds), Instruction Issues per Cycle, and Peak System Rates (MegaFlops/sec.)]


Keep in mind that, in the earliest interval, 1950 to 1960, the computer milieu was being born. As we learned later, lots of original work was simultaneously being done at other places, like Manchester University in Manchester, England; MIT in Cambridge, MA; and the Moore School of Electrical Engineering at the University of Pennsylvania in Philadelphia, PA, to name just a few. However, the results did not get generally circulated. For one thing, there was hardly anyone able to receive such news [3]. There was no computing fraternity, and no large collection of books about computing. Getting started meant each place developed a lot of the ad hoc stuff needed for using the early machines. In the beginning there was a lot of emphasis on speed: fast ways to extract square roots, say, or ways to handle the distance between two points without taking a square root, or ways to start loading a new program without destroying most of the previous memory contents. These were really basic things having just two factors in common: everything seemed NEW and very necessary. No matter how often they were rediscovered, it was always a thrill for the inventor. Both beauty and further ideas awaited the adventurer/programmer. It was a time when cleverness was a lot of fun.
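As a taste of that sort of cleverness, here is a minimal sketch of one common version of the square-root dodge, written in C as a modern stand-in for the hand coding of the day (the point values are hypothetical). When only a comparison of distances is needed, comparing squared distances gives the same answer, because the square root is monotonic on non-negative numbers, and so the then-expensive square root is never taken.

    #include <stdio.h>

    /* Squared Euclidean distance: sufficient for comparisons, since
       sqrt preserves order on non-negative values, so the expensive
       square root is never needed. */
    static double dist_sq(double x1, double y1, double x2, double y2) {
        double dx = x2 - x1, dy = y2 - y1;
        return dx * dx + dy * dy;
    }

    int main(void) {
        /* hypothetical points: is A or B closer to the point P? */
        double ax = 1.0, ay = 2.0, bx = 3.0, by = 0.5, px = 0.0, py = 0.0;
        if (dist_sq(ax, ay, px, py) < dist_sq(bx, by, px, py))
            printf("A is closer to P\n");
        else
            printf("B is closer to P\n");
        return 0;
    }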

Let me turn now briefly to a few questions relating to small computers. In 1952, Sid Fernbach was lured back from a postdoctoral year at Stanford by Edward Teller. He took over the mechanics of acquiring the UNIVAC 1, and subsequently, he was named to head a Computer section as a part of the Theoretical Physics Department. For our purposes here, a small computer then was characterized by three factors:

  1. They used relatively slow components for arithmetic and logic,
  2. They had, initially, relatively small memories,
  3. They were deliciously cheaper.

In any event, all computer procurement was assigned to Sid. It was understood by everyone that computers were to be used in weapons design. Small computers didn't do weapons design; they could do other things for small groups or individuals. This attitude persisted long after small machines started to become common. But small or otherwise, the machines were Sid's responsibility. Since they clearly couldn't be used in nuclear weapons design, Sid set up a group to oversee their procurement, and most of these machines ended up as controllers of one sort or another, for example, as controllers for laboratory instruments. A principal effect was that Sid personally gave them little attention, other than to ensure that the Small Computer group was staffed with competent and acceptable people who assisted other Lab staff in procuring these small machines.

In retrospect, this may have been a strategic error. In a short time, the combined expense of small computer procurement exceeded that for large computers and, of course, given that the budget was relatively static, what the small purchasers got was taken in part from the budget for large computers. Clearly, this caused some stresses to develop. The other contributor to this stress was that the small machines became incredibly more powerful and capacious. Presently one can pay around $3,000 for a machine that is faster than the machines we used in the 1970s, which cost almost $10 million. Moreover, these "small" machines have at least 25 times as much memory. When this was once pointed out to Sid, someone remarked, "Le coût en ôte le goût" ("the cost takes away the taste"), or something like that.

Recapitulating, in this introduction, I have tried to point out that we are not really recording history; we are telling war stories from which someday may emerge a real history. These stories record the personal memories of many at the Lawrence Livermore National Laboratory who developed computing in the early years. Additionally, I have tried to discuss the attitudes of some of the players and the machines they used, with due regard for the Monday-morning-quarterback effect.

As the reader will note, the interviews cover both hardware and software as they evolved from 1950 to 1978. I have chosen 1978 because I believe that, with the playing out of the STAR adventure, the Computation experience at the Lab went into a kind of eclipse. If, now twenty-five years later, it is emerging, that will be a wonderful story for others to tell.

I invite you to enjoy our efforts, always remembering that we actually look forward to your comments and corrections. Just don't make the knives too sharp.




[1] Cray's commitment to speed had a troubling consequence in the first 6600. Whenever a floating-point calculation gave a zero result, to save a cycle the register was not cleared completely; the exponent was left with a legal value and only the fraction held zeros. So, a test for zero always failed. This had awkward consequences for programs that depended on testing for zero in order to terminate a loop, or to branch to some auxiliary code. The simple solution was to clear the result register fully. The reason Cray didn't do that was that it would add one cycle to the time. Eventually, he was persuaded to do it. The effect was dramatic; programs were sped up enormously. Seymour noted that it cost an extra cycle that could have been avoided had we been willing to program around what he thought of as an idiosyncrasy. Everyone had a good laugh at this. Other than that "feature," the 6600 was very highly regarded by all our users.
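For readers who want the flavor of "programming around" such an idiosyncrasy, here is a minimal modern sketch in C. It is emphatically not the original 6600 code, whose word format differed, but it shows the general dodge: test a result's magnitude against a small tolerance instead of testing for exact equality with zero, so the test succeeds no matter what the hardware left in the exponent field.

    #include <math.h>
    #include <stdio.h>

    /* A loop-termination test that does not depend on the hardware
       producing a canonical all-zero word: any result whose magnitude
       falls below the tolerance is treated as zero. */
    static int effectively_zero(double x, double tol) {
        return fabs(x) < tol;
    }

    int main(void) {
        double residual = 1.0;
        int iterations = 0;
        /* toy iteration: halve the residual until it is negligible */
        while (!effectively_zero(residual, 1e-12)) {
            residual *= 0.5;
            iterations++;
        }
        printf("converged after %d iterations\n", iterations);
        return 0;
    }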

[2] Seymour's taste for speed caused him to do certain things to floating-point arithmetic that sacrificed at least one significant digit in the results. For example, he rounded the dividend before performing the division. Needless to say, this was anathema to the mathematical purists in the user community. Those who were not so pure either ignored this or found ways around such problems.
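As a purely illustrative decimal stand-in (this is not Cray's actual divide logic, and the digit counts are invented for the example), the following C fragment shows how rounding a dividend before dividing can cost a significant digit in the quotient:

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double dividend = 1.2345678;   /* eight significant digits */
        double divisor  = 3.0;
        /* pre-round the dividend to seven digits, standing in for
           the hardware's rounding before the divide */
        double rounded  = round(dividend * 1e6) / 1e6;   /* 1.234568 */
        printf("full dividend:    %.10f\n", dividend / divisor);
        printf("rounded dividend: %.10f\n", rounded  / divisor);
        /* the quotients, 0.4115226000 and 0.4115226667, already
           disagree in the eighth significant digit */
        return 0;
    }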

[3] It is true that the Association for Computing Machinery (ACM) was active in sponsoring specialist meetings at which such specialist knowledge could be shared. It is also true that many of the users hardly ever attended such meetings.