Saturday, August 31, 2019

Introduction to Computer Theory

CHAPTER 1 BACKGROUND

The twentieth century has been filled with the most incredible shocks and surprises: the theory of relativity, Communist revolutions, psychoanalysis, nuclear war, television, moon walks, genetic engineering, and so on. As astounding as any of these is the advent of the computer and its development from a mere calculating device into what seems like a "thinking machine." The birth of the computer was not wholly independent of the other events of this century. The history of the computer is a fascinating story; however, it is not the subject of this course. We are concerned with the Theory of Computers, which means that we form several abstract mathematical models that will describe, with varying degrees of accuracy, parts of computers, types of computers, and similar machines. Our models will not be used to discuss the practical engineering details of the hardware of computers, but the more abstract questions of the frontiers of capability of these mechanical devices. There are separate courses that deal with circuits and switching theory (computer logic), with instruction sets and register arrangements (computer architecture), and with data structures, algorithms, operating systems, compiler design, artificial intelligence, and so forth. All of these courses have a theoretical component, but they differ from our study in two basic ways. First, they deal only with computers that already exist; our models, on the other hand, will encompass all computers that do exist, will exist, and that can ever be dreamed of. Second, they are interested in how best to do things; we shall not be interested in optimality at all, but rather we shall be concerned with the question of possibility - what can and what cannot be done. We shall look at this from the perspective of what language structures the machines we describe can and cannot accept as input, and what possible meaning their output may have.
This description of our intent is extremely general and perhaps a little misleading, but the mathematically precise definition of our study can be understood only by those who already know the concepts introduced in this course. This is often a characteristic of scholarship - after years of study one can just begin to define the subject. We are now embarking on a typical example of such a journey. In our last chapter (Chapter 31) we shall finally be able to define a computer. The history of Computer Theory is also interesting. It was formed by fortunate coincidences, involving several seemingly unrelated branches of intellectual endeavor. A small series of contemporaneous discoveries, by very dissimilar people, separately motivated, flowed together to become our subject. Until we have established more of a foundation, we can only describe in general terms the different schools of thought that have melded into this field. The most obvious component of Computer Theory is the theory of mathematical logic. As the twentieth century started, mathematics was facing a dilemma. Georg Cantor (1845-1918) had recently invented the Theory of Sets (unions, intersections, inclusion, cardinality, etc.). But at the same time he had discovered some very uncomfortable paradoxes - he created things that looked like contradictions in what seemed to be rigorously proven mathematical theorems. Some of his unusual findings could be tolerated (such as that infinity comes in different sizes), but some could not (such as that some set is bigger than the universal set). This left a cloud over mathematics that needed to be resolved. David Hilbert (1862-1943) wanted all of mathematics put on the same sound footing as Euclidean Geometry, which is characterized by precise definitions, explicit axioms, and rigorous proofs. The format of a Euclidean proof is precisely specified.
Every line is either an axiom, a previously proven theorem, or follows from the lines above it by one of a few simple rules of inference. The mathematics that developed in the centuries since Euclid did not follow this standard of precision. Hilbert believed that if mathematics were put back on the Euclidean standard the Cantor paradoxes would go away. He was actually concerned with two ambitious projects: first, to demonstrate that the new system was free of paradoxes; second, to find methods that would guarantee to enable humans to construct proofs of all the true statements in mathematics. Hilbert wanted something formulaic - a precise routine for producing results, like the directions in a cookbook. First draw all these lines, then write all these equations, then solve for all these points, and so on and so on, and the proof is done - some approach that is certain and sure-fire, without any reliance on unpredictable and undependable brilliant mathematical insight. We simply follow the rules and the answer must come. This type of complete, guaranteed, easy-to-follow set of instructions is called an algorithm. He hoped that algorithms or procedures could be developed to solve whole classes of mathematical problems. The collection of techniques called linear algebra provides just such an algorithm for solving all systems of linear equations. Hilbert wanted to develop algorithms for solving other mathematical problems, perhaps even an algorithm that could solve all mathematical problems of any kind in some finite number of steps. Before starting to look for such an algorithm, an exact notion of what is and what is not a mathematical statement had to be developed.
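The linear-algebra example just mentioned makes Hilbert's "cookbook" ideal concrete. The sketch below (in Python, chosen here purely for illustration; the chapter itself gives no code) solves a system of linear equations by Gaussian elimination - the same fixed sequence of steps works for any nonsingular system, with no insight required:

```python
def solve_linear_system(a, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    A purely mechanical procedure in Hilbert's sense: fixed steps,
    guaranteed to terminate, no mathematical brilliance needed.
    """
    n = len(a)
    # Build the augmented matrix [A | b], copying so the inputs are untouched.
    m = [row[:] + [bv] for row, bv in zip(a, b)]
    for col in range(n):
        # Pivot: bring the largest entry in this column onto the diagonal.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        # Eliminate the entries below the diagonal.
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    # Back-substitute to read off the solution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3.
print(solve_linear_system([[2, 1], [1, 3]], [5, 10]))  # → [1.0, 3.0]
```

Hilbert hoped that routines of exactly this flavor could be found for whole classes of mathematical problems, perhaps for all of mathematics.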
After that, there was the problem of defining exactly what can and what cannot be a step in an algorithm. The words we have used: "procedure," "formula," "cookbook method," "complete instructions," are not part of mathematics and are no more meaningful than the word "algorithm" itself. Mathematical logicians, while trying to follow the suggestions of Hilbert and straighten out the predicament left by Cantor, found that they were able to prove mathematically that some of the desired algorithms cannot exist - not only at this time, but they can never exist in the future, either. Their main result was even more fantastic than that. Kurt Gödel (1906-1978) not only showed that there was no algorithm that could guarantee to provide proofs for all the true statements in mathematics, but he proved that not all the true statements even have a proof to be found. Gödel's Incompleteness Theorem implies that in a specific mathematical system either there are some true statements without any possible proof or else there are some false statements that can be "proven." This earth-shaking result made the mess in the philosophy of mathematics even worse, but very exciting. If not every true statement has a proof, can we at least fulfill Hilbert's program by finding a proof-generating algorithm to provide proofs whenever they do exist? Logicians began to ask the question: Of what fundamental parts are all algorithms composed? The first general definition of an algorithm was proposed by Alonzo Church. Using his definition, he and Stephen Cole Kleene and, independently, Emil Post were able to prove that there were problems that no algorithm could solve. While also solving this problem independently, Alan Mathison Turing (1912-1954) developed the concept of a theoretical "universal-algorithm machine."
Studying what was possible and what was not possible for such a machine to do, he discovered that some tasks that we might have expected this abstract omnipotent machine to be able to perform are impossible, even for it. Turing's model for a universal-algorithm machine is directly connected to the invention of the computer. In fact, for completely different reasons (wartime code-breaking) Turing himself had an important part in the construction of the first computer, which he based on his work in abstract logic. On a wildly different front, two researchers in neurophysiology, Warren Sturgis McCulloch and Walter Pitts (1923-1969), constructed a mathematical model for the way in which sensory receptor organs in animals behave. The model they constructed for a "neural net" was a theoretical machine of the same nature as the one Turing invented, but with certain limitations. Mathematical models of real and abstract machines took on more and more importance. Along with mathematical models for biological processes, models were introduced to study psychological, economic, and social situations. Again, entirely independent of these considerations, the invention of the vacuum tube and the subsequent developments in electronics enabled engineers to build fully automatic electronic calculators. These developments fulfilled the age-old dream of Blaise Pascal (1623-1662), Gottfried Wilhelm von Leibniz (1646-1716), and Charles Babbage (1792-1871), all of whom built mechanical calculating devices as powerful as their respective technologies would allow. In the 1940s, gifted engineers began building the first generation of computers: the computer Colossus at Bletchley, England (Turing's decoder), the ABC machine built by John Atanasoff in Iowa, the Harvard Mark I built by Howard Aiken, and ENIAC built by John Presper Eckert, Jr. and John William Mauchly (1907-1980) at the University of Pennsylvania.
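The McCulloch-Pitts "neural net" mentioned above is simple enough to sketch. In their scheme, an idealized neuron fires (outputs 1) exactly when the weighted sum of its binary inputs reaches a threshold. The Python below is an illustration in that spirit, not code from the text; the function names are invented here. With unit weights, the threshold alone determines which logical function a single unit computes:

```python
def mp_neuron(inputs, weights, threshold):
    """A McCulloch-Pitts unit: fire (1) iff the weighted sum of the
    binary inputs meets the threshold; otherwise stay silent (0)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With weights of 1 on both inputs, the threshold picks the function:
def and_gate(x, y):
    return mp_neuron([x, y], [1, 1], threshold=2)  # both inputs must fire

def or_gate(x, y):
    return mp_neuron([x, y], [1, 1], threshold=1)  # one firing input suffices

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "AND:", and_gate(x, y), "OR:", or_gate(x, y))
```

Networks of such units turn out to have exactly the power of the finite automata studied in Part I of this book, which is the sense in which they are machines "of the same nature" as Turing's, "but with certain limitations."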
Shortly after the invention of the vacuum tube, the incredible mathematician John von Neumann (1903-1957) developed the idea of a stored-program computer. The idea of storing the program inside the computer and allowing the computer to operate on (and modify) the program as well as the data was a tremendous advance. It may have been conceived decades earlier by Babbage and his co-worker Ada Augusta, Countess of Lovelace (1815-1853), but their technology was not adequate to explore this possibility. The ramifications of this idea, as pursued by von Neumann and Turing, were quite profound. The early calculators could perform only one predetermined set of tasks at a time. To make changes in their procedures, the calculators had to be physically rebuilt, either by rewiring, resetting, or reconnecting various parts. Von Neumann permanently wired certain operations into the machine and then designed a central control section that, after reading input data, could select which operation to perform based on a program or algorithm encoded in the input and stored in the computer along with the raw data to be processed. In this way, the inputs determined which operations were to be performed on themselves. Interestingly, current technology has progressed to the point where the ability to manufacture dedicated chips cheaply and easily has made the prospect of rebuilding a computer for each program feasible again. However, by the last chapters of this book we will appreciate the significance of the difference between these two approaches. Von Neumann's goal was to convert the electronic calculator into a real-life model of one of the logicians' ideal universal-algorithm machines, such as those Turing had described. Thus we have an unusual situation where the advanced theoretical work on the potential of the machine preceded the demonstration that the machine could really exist. The people who first discussed these machines only dreamed they might ever be built.
Many were very surprised to find them actually working in their own lifetimes. Along with the concept of programming a computer came the question: What is the "best" language in which to write programs? Many languages were invented, owing their distinction to the differences in the specific machines they were to be used on and to the differences in the types of problems for which they were designed. However, as more languages emerged, it became clear that they had many elements in common. They seemed to share the same possibilities and limitations. This observation was at first only intuitive, although Turing had already worked on much the same problem, but from a different angle. At the time that a general theory of computer languages was being developed, another surprise occurred. Modern linguists, some influenced by the prevalent trends in mathematical logic and some by the emerging theories of developmental psychology, had been investigating a very similar subject: What is language in general? How could primitive humans have developed language? How do people understand it? How do they learn it as children? What ideas can be expressed, and in what ways? How do people construct sentences from the ideas in their minds? Noam Chomsky created the subject of mathematical models for the description of languages to answer these questions. His theory grew to the point where it began to shed light on the study of computer languages. The languages humans invented to communicate with one another and the languages necessary for humans to communicate with machines shared many basic properties. Although we do not know exactly how humans understand language, we do know how machines digest what they are told. Thus, the formulations of mathematical logic became useful to linguistics, a previously nonmathematical subject.
Metaphorically, we could say that the computer then took on linguistic abilities. It became a word processor, a translator, and an interpreter of simple grammar, as well as a compiler of computer languages. The software invented to interpret programming languages was applied to human languages as well. One point that will be made clear in our studies is why computer languages are easy for a computer to understand whereas human languages are very difficult. Because of the many influences on its development, the subject of this book goes by various names. It includes three major fundamental areas: the Theory of Automata, the Theory of Formal Languages, and the Theory of Turing Machines. This book is divided into three parts corresponding to these topics. Our subject is sometimes called Computation Theory rather than Computer Theory, since the items that are central to it are the types of tasks (algorithms or programs) that can be performed, not the mechanical nature of the physical computer itself. However, the name "computation" is also misleading, since it popularly connotes arithmetical operations that are only a fraction of what computers can do. The term "computation" is inaccurate when describing word processing, sorting, and searching, and awkward in discussions of program verification. Just as the term "Number Theory" is not limited to a description of calligraphic displays of number systems but focuses on the question of which equations can be solved in integers, and the term "Graph Theory" does not include bar graphs, pie charts, and histograms, so too "Computer Theory" need not be limited to a description of physical machines but can focus on the question of which tasks are possible for which machines. We shall study different types of theoretical machines that are mathematical models for actual physical processes.
By considering the possible inputs on which these machines can work, we can analyze their various strengths and weaknesses. We then arrive at what we may believe to be the most powerful machine possible. When we do, we shall be surprised to find tasks that even it cannot perform. This will be our ultimate result: that no matter what machine we build, there will always be questions that are simple to state that it cannot answer. Along the way, we shall begin to understand the concept of computability, which is the foundation of further research in this field. This is our goal. Computer Theory extends further to such topics as complexity and verification, but these are beyond our intended scope. Even for the topics we do cover - Automata, Languages, Turing Machines - much more is known than we present here. As intriguing and engaging as the field has proven so far, with any luck the most fascinating theorems are yet to be discovered.
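To give a taste of the machines studied in Part I: the simplest of them, the finite automaton, processes an input string one letter at a time, moving between finitely many states, and accepts or rejects the string depending on where it ends up. The Python sketch below is an invented illustration (the state names and alphabet are chosen here, not taken from the text); it accepts exactly the strings of a's and b's containing an even number of a's:

```python
def accepts_even_as(s):
    """A two-state finite automaton over the alphabet {a, b} that
    accepts exactly the strings with an even number of a's."""
    # Transition table: state -> {input letter -> next state}
    delta = {
        "even": {"a": "odd",  "b": "even"},
        "odd":  {"a": "even", "b": "odd"},
    }
    state = "even"                    # the start state
    for letter in s:
        state = delta[state][letter]  # one step per input letter
    return state == "even"            # "even" is the only accepting state

print(accepts_even_as("abba"))  # True: two a's
print(accepts_even_as("ab"))    # False: one a
```

Characterizing which languages such machines can and cannot accept, and then climbing to progressively stronger machines, is precisely the program of this book.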

Friday, August 30, 2019

How Far Does the Poet Want Us to Sympathize with Miss Havisham Essay

How Far Does the Poet Want Us to Sympathize with Miss Havisham? The poet wants us to sympathize with Miss Havisham greatly, but not entirely. The trappings of her strong need for revenge, and the morbid existence that has destroyed her, carry a symbolic meaning of self-absorption and destruction. This poem introduces us to Miss Havisham's character: she has become the type of embittered woman who was disappointed in love and has withdrawn from the world. First of all, this poem is written from a first-person point of view. She begins by telling the reader the cause of her pain and suffering - her "beloved sweetheart bastard" - which gravitates into a sense of bitterness and vengeance. In addition, the oxymoron in this phrase sets up a contradiction of words: "beloved" and "sweetheart" suggest a very admirable personality, but the word "bastard" gives us a completely conflicting quality. Besides, she tells us that she not only wished him dead but prayed for his death, evident in "Not a day since then I haven't wished him dead. Prayed for it..." She prayed so hard that she had "dark green pebbles for eyes" and ropes on the back of her hands she could strangle with. She uses metaphors here to explain that while she prayed, her eyes shrank hard and she felt her hands were strong enough to strangle someone, which fits her murderous personality. This makes us feel pity for her, seeing that she has suffered so greatly that it has reached insanity, but at the same time it makes us feel really disturbed by her mad identity. The second stanza symbolises her "self-absorption" and "self-pity". It starts with a single strong word: "Spinster". The caesura at the beginning of this stanza shows how much she emphasizes and detests this word.
Besides, the feeling of abhorrence is further strengthened by the trochee in the word "Spinster", as the first syllable is stressed. Moving on, Miss Havisham is also aware of her own stink, as she never changes her clothes. This shows how withdrawn she is from the world. Moreover, she stays in bed all day and "caws" in denial, which shows how she is on the verge of irrationality and stupidity. At the end of this stanza, she ends with "who did this". She knows very well that she was a big cause of this problem, but I feel that she also wants to put the blame on her ex-fiancé, as she only completes her question in the next stanza. This stanza makes us feel really sorry for her, seeing that she cannot get over her past as it keeps haunting her. In the third stanza, she starts to dream about her lost lover in a tender manner. "Some nights better, the lost body over me..." suggests that she misses her lost lover enormously. She fantasizes about enjoying her time with her ex-lover, but it does not last long: when she finally regains consciousness, her hatred and anger return, evident in "then down till I suddenly bite awake." This stanza truly reaches out to me because I can feel that deep inside she tries to recover the wonderful memories they may have had together, but she eventually decides to ignore them, as she still has that tinge of anger inside her that she cannot let go. The last stanza is mainly about how her rage and abhorrence return. It is somewhat similar to the first stanza, but she seems more furious here. Thinking of how she actually "stabbed a wedding cake" shows us that she is plotting a huge revenge on a "male corpse", which we all assume is her lover. This stanza makes me feel a little frightened of her, as her attitude is rather alarming. Overall, I really do sympathize with Miss Havisham deeply, but I do criticise some of her actions.
For instance, I do not like the fact that she wants to inflict pain on others just for her own sake. Praying for someone to die and planning revenge on someone is not the right way to solve a problem. However, I do greatly pity her because of the phase she is going through. It is not easy getting over someone. In conclusion, the poet wants us all to sympathize with her greatly, but only to a certain extent. We commiserate with her for her peculiarity and her self-indulgence, but her sullenness and vindictiveness make us feel that she is a vicious and debauched person.

Thursday, August 29, 2019

Being and Becoming: Becoming by Being Essay

The pre-Socratic era stretched from Thales of Miletus until the period when Socratic philosophy was yet to be born. It was during this period that the fundamentals of science (both natural and social) were being founded through scientific research and inquiry, and when philosophy and practical science were still married. The sophoi tried to understand and explain the origin, nature, elements, development, and workings of the universe by way of argumentative reasoning, critical inquiry, and justification. Pre-Socratic philosophy was mainly characterized by elements such as essence, change/the absolute, and harmony, and by its effort to understand the essential substance of a thing that caused its existence and the dynamic movement it undergoes (changes) to become what it is known as today. Among the famous thinkers of this period were Thales, Heraclitus, Anaxagoras, Empedocles, Democritus, and Parmenides. Whereas the pre-Socratic philosophers had formulated a common line of thought, disparity was inevitable. Among the arguments that showed the differences within the philosophy of that period were Parmenides' theory of Being and Heraclitus' theory of Becoming. Heraclitus argued that the existence of everything was brought about by nothing else, and that everything continues to exist through constant change, by undergoing a dynamic transformation. What is most striking about the concept of change for Heraclitus is the concept of change from within. According to him, it is the contradiction of elements and substances within an object that causes it to transform, and no external intervention need be imposed to cause its change. Accordingly, for Heraclitus, the world is a continuous struggle and strife; hence it needs change. In contrast, while the internal aspects of an existing element undergo alterations, the process by which an element transforms is ever constant.
Through understanding the nature of an element, Heraclitus recognized that the fixed states of being are all part of the varied state of perpetual becoming1. In humans, the processes of giving birth, living, dying, and rebirth are all changes that a person passes through. However, such a pattern is, after all, a never-ending cycle. What will "become" of a matter is a product of the dynamic development it subjects itself to, through a never-ceasing rhythm. Contrary to Heraclitus, while Parmenides likewise argued that an object exists because it does exist (no other factor may explain the cause of its existence), he failed to recognize whether it ever underwent an evolutionary state that made it the "being" it is today. Because Parmenides believed - and thereby apparently refuted Heraclitus - that the universe was already in a state of stability, why should it engage in a process of modification? Everything is what it is because it is what it is, and it cannot become what it is not. Both arguments are of much interest, specifically in understanding how we "become" or what makes us come into "being". Later on, in Plato's time, both arguments could be reconciled by proposing that what might "become" is caused by a "being". However, unlike the foregoing arguments of Heraclitus and Parmenides, it is apparent that this reconciliation was based on the thought that there is, indeed, a "first cause" that never changes but rather causes the "second being" to become what it is today. Note that neither of the initially mentioned thinkers believed in something that might have caused an object to exist. What could be more difficult in the understanding of this discourse is the process of analyzing concepts that flourished centuries apart and merging them into one critical explanation such that a resolution of the conflict could be gained.

References:

__________. Philosophy Pages.
In Britannica Internet Guide Selection. Retrieved April 11, 2008, from http://www.philosophypages.com/dy/p.htm#parm

__________. (April 16, 2002). Pre-Socratic Era. Posted to http://everything2.com/index.pl?node_id=628825

Ballantyne, Paul F. (2008). History and Theory of Psychology: An Early 21st Century Student's Perspective. Retrieved April 10, 2008, from www.comnet.ca/~pballan/section1(210).htm

Goodman, Len E. (1992). Avicenna: Arabic Thought and Culture (pp. 53-54). Routledge. Retrieved April 12, 2008, from http://books.google.com.ph/books?id=VJ6x-pcqMicC&pg=PA51&lpg=PA51&dq=resolving+the+argument+of+being+and+becoming&source=web&ots=gctA47HxTQ&sig=R0YNJ23QzZlvTpaLA5XclFgdKfY&hl=en#PPR5,M1

Rose, Jake. Being and Becoming. In Ezine Articles. Retrieved April 11, 2008, from http://ezinearticles.com/?Being-and-Becoming&id=148729

Wednesday, August 28, 2019

Analysis of Business Operations Case Study Example | Topics and Well Written Essays - 500 words

Analysis of Business Operations - Case Study Example The management tool of my choice is the affinity diagram, when it comes to applying one of these tools in my business, which deals with the sales and marketing of various products. The use of the affinity diagram greatly depends on team effort. It therefore requires the complete attention of the whole team operating in the business. The importance of using the diagram in my business arises from the fact that a lot of information is generated. The team therefore needs to sort through this information to come up with the most effective measures for increasing a product's sales. The tool is also applicable when the answers required are not obvious to all the team members working on a presented problem. The solutions that are adopted come from the general consensus reached by all the team members. The tool is vital since it helps in establishing connections between pieces of information that were previously invisible (Hutchins 56). It also greatly assists in brainstorming the causes of and solutions to various problems being experienced, especially in situations where little information is available. The business benefits in a variety of ways from the use of the affinity diagram. The diagram assists in making breakthroughs on various problems, and it enables greater teamwork. It additionally helps in revealing relationships between various pieces of information and in building greater critical-thinking skills within my business team. The creation of these skills within the business greatly assists in solving the problems that our clients forward to us, through the development of the most cost-effective and efficient solutions.
The use of the affinity diagram has enabled the team members within the business to develop better communication skills when dealing with any problems that are brought to the company. This move

Tuesday, August 27, 2019

A Comparison of Different Software Methods Thesis

A Comparison of Different Software Methods - Thesis Example Businesses have grown over the years, and the web is an integral component of the business world. Global industry has seen the advantages of managing business, reaching out to customers in an interactive mode, and selling products and services online through e-commerce. Once the business logic is set, web applications play a pivotal role in business decision-making. The IT industry has long been building tools, and testing has always been an integral part of the software development life cycle. Ralph Grove specified that although the purpose of testing is to ensure that web applications work correctly, the practical side is to identify the errors in them (Grove 218). Software development will produce errors, and the identification and correction of these errors is what is called debugging the software. In earlier days, testing commenced only when the coding was complete, but nowadays it is more integral and runs concurrently with coding. Testing happens in various phases, and planning needs to be done from the commencement of a web application. Prototypes need to be developed as per the user specifications. The testers need to simulate the working environment of users all across the globe; therefore compatibility plays an important role. They need to understand that no software or plug-in will be installed separately by the users, so the application should be readily available for deployment. In this thesis, five web application testing methods will be defined and elaborated. The theory will be illustrated and compared against a practical web application. Of these five methods, three will be chosen. Real-time web applications will be chosen from public websites, and a detailed theoretical and practical comparison will be made, along with the application of the chosen testing methods.
This will give the researchers an insight into the practical problems faced by developers and users if an application is not properly tested. Care needs to be taken on various fronts: problems need to be identified beforehand and measures taken to ensure smooth deployment and gradual upgrading of the web application software.