This thesis addresses a problem at the nexus of engineering, computer science, and economics: in large-scale, decentralized systems, how can we efficiently allocate scarce resources among competing interests? On one hand, the inherent architecture of any large-scale system imposes constraints on the system designer. These constraints are counterbalanced by the need to design mechanisms that allocate resources efficiently, even when the system is used by participants who have only their own interests at stake. We consider the design of resource allocation mechanisms in such environments. The analytic approach we pursue is characterized by four salient features. First, the monetary value of a resource allocation is measured by the aggregate surplus (aggregate utility less aggregate cost) achieved at that allocation. An efficient allocation is one that maximizes aggregate surplus. Second, we focus on market-clearing mechanisms, which set a single price to ensure that demand equals supply. Third, all the mechanisms we consider yield a fully efficient allocation if market participants do not anticipate the effects of their actions on market-clearing prices. Finally, when market participants are price anticipating, full efficiency is generally not achieved...
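To make the market-clearing idea concrete, the sketch below computes a single clearing price for price-taking users. It is a minimal illustration, not the thesis's mechanism: the logarithmic utilities, the weights `alphas`, and the capacity `C` are all assumed for the example.

```python
def demand(a, p):
    """Optimal demand of a price-taking user with utility a*log(1+d) at price p."""
    return max(a / p - 1.0, 0.0)

def market_clearing_price(alphas, capacity, lo=1e-9, hi=1e9, iters=100):
    """Bisect on the price until aggregate demand matches supply."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sum(demand(a, mid) for a in alphas) > capacity:
            lo = mid          # aggregate demand too high: raise the price
        else:
            hi = mid          # demand at or below supply: lower the price
    return 0.5 * (lo + hi)

alphas = [4.0, 2.0, 1.0]      # hypothetical utility weights, one per user
C = 3.0                       # hypothetical total capacity
p = market_clearing_price(alphas, C)
print(round(p, 4), [round(demand(a, p), 3) for a in alphas])
```

Bisection works here because each price-taking user's demand is decreasing in price, so aggregate demand crosses the supply level exactly once.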
Advances in hardware design and manufacturing often lead to new ways in which problems can be solved computationally. In this thesis we explore fundamental problems in three computational models that are based on such recent advances. The first model is based on new chip architectures, where multiple independent processing units are placed on one chip, allowing for unprecedented parallelism in hardware. We provide new scheduling algorithms for this computational model. The second model is motivated by peer-to-peer networks, where countless (often inexpensive) computing devices cooperate in distributed applications without any central control. We present and analyze new algorithms for load balancing and for locality-aware distributed data storage in peer-to-peer networks. The last model is based on extensions of the streaming model; it is an attempt to capture the class of problems that can be efficiently solved on massive data sets. We give a number of algorithms for this model and compare it to other models that have been proposed for massive data set computations. Our algorithms and complexity results for these computational models follow the central thesis that modeling real-world computational structures is an important part of theoretical computer science...
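As a generic illustration of scheduling for multiple on-chip processing units (not the thesis's own algorithms, which are not reproduced here), the following sketch implements Graham's classic greedy list scheduling for makespan; the job lengths are hypothetical.

```python
import heapq

def list_schedule(jobs, m):
    """Graham's greedy list scheduling: place each job on the currently
    least-loaded of m cores (a classic (2 - 1/m)-approximation for makespan)."""
    heap = [(0.0, core) for core in range(m)]   # (current load, core id)
    heapq.heapify(heap)
    assignment = []
    for length in jobs:
        load, core = heapq.heappop(heap)
        assignment.append(core)
        heapq.heappush(heap, (load + length, core))
    return assignment, max(load for load, _ in heap)

print(list_schedule([3, 1, 4, 1, 5, 9, 2, 6], m=4))  # hypothetical job lengths
```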
by James William Stamos.; Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1982.; MICROFICHE COPY AVAILABLE IN ARCHIVES AND ENGINEERING; Bibliography: leaves 128-131.
CiteSeer is a well-known online resource for the computer science research community, allowing users to search and browse a large archive of research papers. Unfortunately, its current centralized incarnation is costly to run. Although members of the community would presumably be willing to donate hardware and bandwidth at their own sites to assist CiteSeer, the current architecture does not facilitate such distribution of resources. OverCite is a design for a new architecture for a distributed and cooperative research library based on a distributed hash table (DHT). The new architecture harnesses donated resources at many sites to provide document search and retrieval service to researchers worldwide. A preliminary evaluation of an initial OverCite prototype shows that it can service more queries per second than a centralized system, and that it increases total storage capacity by a factor of n/4 in a system of n nodes. OverCite can exploit these additional resources by supporting new features such as document alerts, and by scaling to larger data sets.; by Jeremy Stribling.; Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005.; Includes bibliographical references (leaves 47-50).
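OverCite stores and locates documents through a DHT. The toy sketch below illustrates the underlying consistent-hashing idea in a Chord-like style; the node names and document key are hypothetical, and this is not OverCite's actual code.

```python
import hashlib
from bisect import bisect_right

def ring_hash(key):
    """Map a string onto a 160-bit identifier ring, as in Chord-style DHTs."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16)

class ToyDHT:
    """Consistent hashing: each key is owned by its clockwise successor node."""
    def __init__(self, nodes):
        self.ring = sorted((ring_hash(n), n) for n in nodes)

    def lookup(self, key):
        ids = [h for h, _ in self.ring]
        i = bisect_right(ids, ring_hash(key)) % len(self.ring)
        return self.ring[i][1]

dht = ToyDHT(["node-%d" % i for i in range(8)])    # hypothetical node names
print(dht.lookup("stribling-overcite-2005.pdf"))   # node responsible for this key
```

Because keys spread roughly evenly around the ring, adding donated nodes grows both storage and query capacity without any central coordinator.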
The iLab Heat Transfer Project website started four years ago to enable web access to experiments related to the movement of heat through transport processes. This thesis details improvements that extend and enhance the site as it existed prior to this project. Software improvements include giving teaching assistants the ability to add their entire class as users simultaneously and creating a method by which feedback data is stored as a full questionnaire instead of as individual database entries. Hardware improvements include the addition of a webcam that streams video and audio of the experiment in real time and the integration of two new thermodynamic experiments, complete with remote access. The final improvement is the administrator manual, which is intended to ease the burden on new staff members by bridging their knowledge with that of previous years.; by David P. Saylor.; Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, September 2005; and, (S.B.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, June 2004.; "September 5, 2005." "Copyright 1998."; Includes bibliographical references (p. 74).
This thesis describes a course scheduling system that models planning as a satisfiability problem in relational logic. Given a set of course requirements for a degree program, our system can find a schedule of courses that will complete these requirements. It supports a flexible XML format for expressing course requirements and also handles additional user-specific constraints, such as requirements that certain courses be taken at particular times. Various optimizations were included in the translation to relational logic to improve the performance of our system and the quality of its results. We ran experiments on our system using degree programs from the Department of Electrical Engineering and Computer Science at MIT as input, and found that our approach is competitive with conventional planners.; by Vincent S. Yeung.; Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2006.; Includes bibliographical references (p. 83-84).
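The thesis translates course requirements to relational logic and hands them to a constraint solver. As a much smaller stand-in for that pipeline, the sketch below enumerates term assignments under prerequisite, offering, and per-term-load constraints; the catalog data and course numbers are hypothetical.

```python
from itertools import product

# Hypothetical toy catalog: course -> (prerequisites, terms when offered)
CATALOG = {
    "6.01":  (set(),      {1, 2}),
    "6.006": ({"6.01"},   {1, 2}),
    "6.046": ({"6.006"},  {2, 3}),
    "6.034": ({"6.01"},   {3}),
}

def schedules(required, n_terms=4, per_term=2):
    """Yield term assignments satisfying prerequisites, offerings, and load."""
    courses = list(required)
    for terms in product(range(1, n_terms + 1), repeat=len(courses)):
        plan = dict(zip(courses, terms))
        ok = all(
            t in CATALOG[c][1]                              # offered that term
            and all(plan[p] < t for p in CATALOG[c][0])     # prereqs come first
            and sum(1 for x in plan.values() if x == t) <= per_term
            for c, t in plan.items()
        )
        if ok:
            yield plan

print(next(schedules({"6.01", "6.006", "6.046", "6.034"})))
```

A real solver encodes the same constraints symbolically and searches far larger spaces; this brute-force version only shows what is being asked of it.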
The advantages of delivering injections via needle-free methods are numerous. However, conventional methods for needle-free injection lack sufficient control over depth of penetration and shape of injection. Thus, a needle-free injector was designed, constructed, and tested, using a controllable linear Lorentz-force actuator. This actuator allows rapid control of the injection pressure during injections. Using this device, precise control over delivery parameters can be achieved. In addition, several portable power systems for this injector were developed, allowing the energy-intensive needle-free injector to be used in the field. The injector design was tested for repeatability and evaluated in both in-vitro and in-vivo tests on murine tissue using a bacterial collagenase.; by Brian D. Hemond.; Thesis (S.B.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, June 2004; and, (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, February 2006.; Includes bibliographical references (leaves 89-90).
by Steven Robert Shaw.; Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science; and, (Elec. E.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2000.; Earlier issued with only one degree specified (M. Eng.); the M. Eng. degree was awarded in 1997 and the Elec. E. degree in 2000.; Includes bibliographical references (leaves 221-223).
Department of Computer Science at the Naval Postgraduate School webpage.; Educating our Computer Science master's and doctoral students is the most important activity of the department. Our central focus is equipping our students to perform well as technical leaders in the world they will face after graduation. Complexity and change are dominant characteristics of that world. We develop technical leaders by teaching a principles-based curriculum, around which we build practices for managing complexity and innovation. We conduct an extensive research program that directly enhances national security by increasing the effectiveness of the armed forces of the United States and its allies.
A trade-off between linearity and efficiency exists in conventional power amplifiers. The outphase amplifying concept overcomes this trade-off by enabling the use of high-efficiency, non-linear power amplifiers for linear amplification. An amplitude-modulated signal is first decomposed into two constant-amplitude, phase-modulated signals that can be amplified using two high-efficiency switching power amplifiers. The two outputs are then recombined to restore the original amplitude-modulated signal. In this manner, an outphase power amplifier can simultaneously achieve high efficiency and good linearity. This thesis investigates the capability of the outphase amplifying technique in modern wireless communication. First, a digital amplitude-to-phase conversion scheme is proposed to facilitate the outphase decomposition. By taking advantage of the available computational power in current digital technology, the amplitude-to-phase conversion can be implemented with both accuracy and efficiency in the digital domain. A proof-of-concept outphase power amplifier is fabricated using the IBM 7WL SiGe BiCMOS process technology. The test chip includes two class-E power amplifiers and the first 5.8 GHz fully integrated Wilkinson power combiner. The low-loss integrated combiner allows efficient outphase recombining while providing the necessary input isolation for a robust outphase power amplifier. The outphase power amplifier achieves an efficiency of 47% at the maximum output power of 18.5 dBm. For an input Orthogonal Frequency Division Multiplexing (OFDM) signal of 32 sub-channels of 64-QAM...
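The outphase decomposition itself is a closed-form amplitude-to-phase conversion. A minimal NumPy sketch (assuming an ideal, lossless combiner and a test signal chosen purely for illustration) shows how an amplitude-modulated baseband signal splits into two constant-envelope components that recombine exactly.

```python
import numpy as np

def outphase_decompose(baseband, a_max):
    """Split a complex baseband signal into two constant-envelope components:
    s = s1 + s2 with |s1| = |s2| = a_max / 2 (LINC-style outphasing)."""
    a = np.abs(baseband)
    phi = np.angle(baseband)
    theta = np.arccos(np.clip(a / a_max, 0.0, 1.0))   # outphasing angle
    s1 = 0.5 * a_max * np.exp(1j * (phi + theta))
    s2 = 0.5 * a_max * np.exp(1j * (phi - theta))
    return s1, s2

t = np.linspace(0, 1, 1000)
s = (0.6 + 0.4 * np.cos(2 * np.pi * 5 * t)) * np.exp(1j * 2 * np.pi * 50 * t)
s1, s2 = outphase_decompose(s, a_max=1.0)
print(np.max(np.abs(s1 + s2 - s)))   # recombination error, numerically ~ 0
```

Since s1 + s2 = a_max * cos(theta) * e^{j*phi} = a(t) e^{j*phi(t)}, the sum restores the original signal; in hardware, this exact cancellation is what the low-loss Wilkinson combiner must approximate.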
Using concepts from computer science and mathematics, I develop three algorithms to find the minimum integer weights for voting games. Games with as many as 17 players can be solved in a reasonable amount of time. First, coalitions are mapped to constraints, reducing the problem to constraint optimization. The optimization techniques used are Gomory's all-integer simplex algorithm and a variant of the popular integer programming method branch and bound. Theoretical results include that minimum integer weights are not unique, along with a confirmation of a prior result that minimum integer weights are proportional to a priori seat share. Thus, these algorithms can be useful for researchers evaluating the differences between proportional bargaining models and formateur models. The running times of the different algorithms are contrasted and analyzed for potential improvements.; by Aaron B. Strauss.; Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2003.; Includes bibliographical references (p. 73-76).
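The thesis's algorithms rely on integer programming; as a hedged, brute-force stand-in that only works for very small games, the sketch below searches directly for minimum-total integer weights and a quota that reproduce a given set of winning coalitions. The three-player majority game is a made-up example.

```python
from itertools import product, combinations

def represents(weights, quota, winning, n):
    """Check that a coalition wins iff its total weight meets the quota."""
    for r in range(n + 1):
        for coal in combinations(range(n), r):
            wins = sum(weights[i] for i in coal) >= quota
            if wins != (frozenset(coal) in winning):
                return False
    return True

def min_integer_weights(winning, n, max_w=10):
    """Brute-force search for a minimum-total integer representation."""
    best = None
    for weights in product(range(1, max_w + 1), repeat=n):
        for quota in range(1, sum(weights) + 1):
            if represents(weights, quota, winning, n):
                if best is None or sum(weights) < sum(best[0]):
                    best = (weights, quota)
    return best

# Hypothetical 3-player majority game: any two players form a winning coalition
winning = {frozenset(c) for c in [(0, 1), (0, 2), (1, 2), (0, 1, 2)]}
print(min_integer_weights(winning, 3))   # e.g. ((1, 1, 1), 2)
```

The search returns the first minimum-total representation it encounters; as the thesis notes, minimum integer weights need not be unique.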
Personalization capabilities in computer applications attempt to better meet the needs of individuals. The more traditional and widespread paradigm in application design is that the user should adapt to the available application; this requires that the individual user's task be sliced and molded to fit the dimensions offered by an inflexible, monolithic application. It is desirable to have an application that can be shaped to fit each individual user's dynamic needs, but it is important that this be done in an intuitive and unobtrusive way. In this thesis, we design and evaluate a personalizable application developed to aid life science researchers in their work. We designed the application in Haystack, a platform for developing semantic applications and user interfaces. The application gave users flexibility in personalizing the way in which information is organized and displayed, while providing access to the tools necessary to perform their tasks. We selected researchers as the user group to focus on because their work inherently demands originality and dynamic adaptation. Life sciences research was chosen as the domain due to its potential to benefit from the application of semantic technologies. We tested how users reacted and adapted to this application by conducting a formal user study.; by Sumudu Weerakoon Watugala.; Thesis (M. Eng.)--Massachusetts Institute of Technology...
We describe a DNA computing system called programmed mutagenesis, prove that it is universal, and present experimental results from a prototype computation. DNA is a material with important characteristics that make it an attractive substrate for computation: it carries all the information necessary for self-reproduction in the presence of appropriate enzymes and components, it evolves by a simple natural mechanism, and it operates at miniature scale. For computer science, using single DNA molecules to represent the state of a computation holds the promise of a new paradigm of composable molecular computing. For biology, the demonstration that DNA sequences can guide their own evolution under computational rules may have implications as we begin to unravel the mysteries of genome encoding. Programmed mutagenesis is a DNA computing system that uses cycles of DNA annealing, ligation, and polymerization to implement programmatic rewriting of DNA sequences. We show that programmed mutagenesis is theoretically universal by demonstrating how Minsky's 4-symbol, 7-state Universal Turing Machine can be implemented using a programmed mutagenesis system. Each step of the Universal Turing Machine is implemented by four cycles of programmed mutagenesis...
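As a purely software analogy for programmed rewriting (not the wet-lab annealing, ligation, and polymerization protocol), the sketch below applies at most one rewriting rule per cycle to a DNA-like string; the rule set and starting sequence are invented for illustration.

```python
def apply_cycle(sequence, rules):
    """One abstract 'mutagenesis cycle': rewrite the first matching
    recognition site, if any (software analogy, not the lab protocol)."""
    for site, replacement in rules:
        if site in sequence:
            return sequence.replace(site, replacement, 1)
    return sequence

# Hypothetical ordered rule set; each cycle advances the encoded state.
rules = [("TAC", "TCC"), ("TCC", "TTC")]
seq = "GGTACGG"
for cycle in range(3):
    seq = apply_cycle(seq, rules)
    print(cycle, seq)   # the site mutates stepwise: TAC -> TCC -> TTC
```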
This paper sets out to examine the skills gap between the industrial application of Information Technology and university academic programmes (curricula). It looks at some of the causes, considers probable solutions for bridging the gap, and suggests the possibility of exploring a new role for our universities and employers of labor. It also highlights strategies to eliminate the misalignment between university and industry. The main concept is to blend academic rigour with industrial relevance.; Comment: 10 pages, IEEE format, International Journal of Computer Science and Information Security, IJCSIS 2009, ISSN 1947-5500, Impact factor 0.423
At the very beginning of compiling a bibliography, usually only basic information about each item is known, such as its title, authors, and publication date. To gather additional information about a specific item, one typically has to search the library catalog or use a web search engine, a look-up procedure that implies manual effort for every single item in a bibliography. In this technical report we present a proof of concept that utilizes Linked Data technology for the simple enrichment of sparse metadata sets. This is done by discovering owl:sameAs links between an initial set of computer science papers and resources from external data sources such as DBLP, ACM, and the Semantic Web Conference Corpus. We demonstrate how the link discovery tool Silk is used to detect additional information and to enrich an initial set of records in the computer science domain. The pros and cons of Silk as a link discovery tool are summarized at the end.; Comment: 22 pages, 4 figures, 7 listings, presented at SWIB12
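Silk expresses linkage rules declaratively; as a rough Python analogue of what such a rule does, the sketch below emits owl:sameAs triples for record pairs whose titles exceed a similarity threshold. The URIs, records, and threshold are hypothetical.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Normalized string similarity in [0, 1] (a stand-in for Silk's metrics)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def discover_sameas(local, remote, threshold=0.9):
    """Emit owl:sameAs triples for record pairs whose titles match closely."""
    triples = []
    for uri_a, title_a in local.items():
        for uri_b, title_b in remote.items():
            if similarity(title_a, title_b) >= threshold:
                triples.append((uri_a, "owl:sameAs", uri_b))
    return triples

# Hypothetical sparse local records and a DBLP-like remote data set
local = {"ex:p1": "Maps of Computer Science"}
remote = {"dblp:conf/x/1": "Maps of computer science."}
print(discover_sameas(local, remote))
```

Once a link is established, any richer metadata attached to the remote resource (venue, pages, identifiers) can be pulled across to enrich the sparse local record.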
We describe a practical approach for visual exploration of research papers. Specifically, we use the titles of papers from the DBLP database to create what we call maps of computer science (MoCS). Words and phrases from the paper titles are the cities in the map, and countries are created based on word and phrase similarity, calculated using co-occurrence. With the help of heatmaps, we can visualize the profile of a particular conference or journal over the base map. Similarly, heatmap profiles can be made of individual researchers or groups such as a department. The visualization system also makes it possible to change the data used to generate the base map. For example, a specific journal or conference can be used to generate the base map, and the heatmap overlays can then show the evolution of research topics in the field over the years. As before, the profiles of individual researchers or research groups can be visualized using heatmap overlays, this time over the journal or conference base map. Finally, research papers or abstracts can easily be turned into visual abstracts giving a visual representation of the distribution of topics in the paper. We outline a modular and extensible system for term extraction using natural language processing techniques...
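A toy version of the first pipeline stages (assumed details: whitespace tokenization, raw co-occurrence counts, cosine-style normalization) shows how term similarity can be computed from titles before any map layout is attempted.

```python
from collections import Counter
from itertools import combinations
import math

def cooccurrence(titles):
    """Count word frequencies and how often pairs of words share a title."""
    pair_counts, word_counts = Counter(), Counter()
    for title in titles:
        words = set(title.lower().split())
        word_counts.update(words)
        pair_counts.update(frozenset(p) for p in combinations(sorted(words), 2))
    return word_counts, pair_counts

def cosine(w1, w2, word_counts, pair_counts):
    """Cosine-style similarity between two words from co-occurrence counts."""
    co = pair_counts[frozenset((w1, w2))]
    return co / math.sqrt(word_counts[w1] * word_counts[w2])

titles = ["streaming algorithms for graphs",
          "graph streaming lower bounds",
          "semantic web ontologies"]
wc, pc = cooccurrence(titles)
print(cosine("streaming", "graphs", wc, pc))   # high for co-occurring terms
```

In the full system, a clustering of this similarity graph yields the "countries," and per-venue or per-author term frequencies drive the heatmap overlays.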
This paper examines the differences and similarities between two online computer science citation databases, DBLP and CiteSeer. The database entries in DBLP are inserted manually, while the CiteSeer entries are obtained autonomously via a crawl of the Web and automatic processing of user submissions. CiteSeer's autonomous citation database can be considered a form of self-selected online survey. It is important to understand the limitations of such databases, particularly when citation information is used to assess the performance of authors, institutions, and funding bodies.
We show that the CiteSeer database contains considerably fewer single-author papers. This bias can be modeled by an exponential process with an intuitive explanation. The model permits us to predict that the DBLP database covers approximately 24% of the entire literature of Computer Science. CiteSeer is also biased against low-cited papers.
Despite their differences, the two databases exhibit citation distributions that are similar to each other and significantly different from those found in previous analyses of the Physics community. In both databases, we also observe that the number of authors per paper has been increasing over time.; Comment: ECDL 2005
We describe the Computing Research Repository (CoRR), a new electronic archive for rapid dissemination and archiving of computer science research results. CoRR was initiated in September 1998 through the cooperation of the ACM, the LANL (Los Alamos National Laboratory) e-Print archive, and NCSTRL (Networked Computer Science Technical Reference Library). Through its implementation of the Dienst protocol, CoRR combines the open and extensible architecture of NCSTRL with the reliable access and well-established management practices of the LANL XXX e-Print repository. This architecture will allow integration with other e-Print archives and provides a foundation for a future broad-based scholarly digital library. We describe the decisions that were made in creating CoRR, the architecture of the CoRR/NCSTRL interoperation, and issues that have arisen during the operation of CoRR.; Comment: Submission to ACM DL99
It is our purpose in this article to present aspects of Special Education and to discuss some fundamental educational principles in the use of computer science within Special Education, considering the importance of the new technologies and the need to work with computers within a pedagogy that prioritizes the restructuring of knowledge rather than the development of a technology for the mere reproduction of information. Work with computer science leads us to a methodology through which all education is seen as a special process, and the educator must present a new profile in the face of technological resources and the new technologies in communication.
The Central University of Venezuela, as part of its efforts to adapt its academic offerings to national and international needs, is conducting a project to review, evaluate, and modify its Computer Science curriculum, in order to train the professionals required by the country. In this paper we present the results of the first stage of the project: an assessment of the Computer Science curriculum based on various data collection instruments used to determine professors', students', and graduates' perceptions of the program, as well as the deficiencies and potential of our graduates according to the companies and organizations that hire them. The results show that there exists a gap between our professors' perception of the program and the opinion of the employers. We also present information about student performance during the last decade, which is an input to the next stage of the program redesign.