811 results for Architecture and Complexity


Relevance: 90.00%

Publisher:

Abstract:

National Science Foundation Grant no. MEA-8310520 and Office of Naval Research Grant no. N00014-83-G-0066.

Relevance: 90.00%

Publisher:

Abstract:

Caption in Michigan Alumnus: These three comely coeds are now known as "tradition topplers" on the Michigan campus. They are the first to study naval architecture, a course for years taken only by males. Left to right are: Darien Pinney, Judy Robinson, and Susan Ott.

Relevance: 90.00%

Publisher:

Abstract:

The international perspectives on these issues are especially valuable in an increasingly connected, but still institutionally and administratively diverse, world. The research addressed in several chapters in this volume includes issues around technical standards such as EpiDoc and the TEI, engaging with the ways these standards are implemented, documented, taught, used in the process of transcribing and annotating texts, and used to generate publications and as the basis for advanced textual or corpus research. Other chapters focus on various aspects of philological research and content creation, including collaborative or community-driven efforts, and the issues surrounding editorial oversight, curation, maintenance and sustainability of these resources. Research into ancient languages and linguistics, in particular Greek, and the language teaching that is a staple of our discipline, is also discussed in several chapters, especially the ways in which advanced research methods can lead into language technologies and vice versa, and the ways in which teaching skills can be used for public engagement, and vice versa. A common thread through much of the volume is the importance of open access publication and open source development and distribution of texts, materials, tools and standards, both because of the public good provided by such models (circulating materials often already paid for out of the public purse) and because of their ability to reach non-standard audiences: those who cannot access rich university libraries or afford expensive print volumes. Linked Open Data is another technology that results in wide and free distribution of structured information both within and outside academic circles, and several chapters present academic work that includes ontologies and RDF, either as a direct research output or as an essential part of the communication and knowledge representation.
Several chapters focus not on the literary and philological side of classics but on the study of cultural heritage, archaeology, and the material supports on which original textual and artistic material is engraved or otherwise inscribed, addressing the capture and analysis of artefacts in both 2D and 3D, the representation of data through archaeological standards, and the importance of sharing information and expertise between the several domains, both within and outside academia, that study, record and conserve ancient objects. Almost without exception, the authors reflect on the issues of interdisciplinarity and collaboration, the relationship between their research practice and teaching and/or communication with a wider public, and the importance of the role of the academic researcher in contemporary society and in the context of cutting-edge technologies. How research is communicated in a world of instant-access blogging and 140-character micromessaging, and how our expectations of the media affect not only how we publish but how we conduct our research, are questions about which all scholars need to be aware and self-critical.

Relevance: 90.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 90.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 90.00%

Publisher:

Abstract:

A specialised reconfigurable architecture targeted at wireless base-band processing is presented. It caters for multiple wireless standards, consumes less power than processor-based solutions, and can be scaled to run in parallel for processing multiple channels. Test resources and testing strategies are embedded in the architecture. The architecture is functionally partitioned according to the common operations found in wireless standards, such as CRC error detection, convolution and interleaving. These modules are linked via Virtual Wire hardware modules and route-through switch matrices, so data can be processed in any order through this interconnect structure. Virtual Wire provides the same flexibility as conventional interconnects while reducing both the area occupied and the number of switches needed. The testing algorithm exhaustively scans all possible paths within the interconnection network and searches for faults in the processing modules. It starts by scanning the externally addressable memory space and testing the master controller. The controller then tests every switch in the route-through switch matrix by making loops from the shared memory to each of the switches. The local switch matrix is tested in the same way. Next, the local memory is scanned. Finally, pre-defined test vectors are loaded into local memory to check the processing modules. This paper compares various base-band processing solutions, describes the proposed platform and its implementation, outlines the test resources and algorithm, and concludes with the mapping of the Bluetooth and GSM base-bands onto the platform.
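The test ordering described in the abstract can be sketched as a small harness. Everything below is an illustrative assumption, not the authors' API: the `StubPlatform` class and its method names merely stand in for the real hardware so the sequence (external memory, master controller, route-through switches, local switches, local memory, then test vectors) can be exercised.

```python
class StubPlatform:
    """Minimal stand-in for the reconfigurable platform; all names
    here are hypothetical, chosen only to mirror the test stages."""
    def __init__(self, n_switches=4, n_local_switches=2):
        self.n_switches = n_switches
        self.n_local = n_local_switches
    def scan_external_memory(self): return True
    def test_master_controller(self): return True
    def loop_test_switch(self, i): return True        # shared-memory loop
    def loop_test_local_switch(self, i): return True
    def scan_local_memory(self): return True
    def run_test_vectors(self): return True           # checks processing modules

def self_test(p):
    """Run the stages in the order the abstract describes and collect
    (stage name, pass/fail) pairs."""
    steps = [("external memory", p.scan_external_memory()),
             ("master controller", p.test_master_controller())]
    steps += [(f"route-through switch {i}", p.loop_test_switch(i))
              for i in range(p.n_switches)]
    steps += [(f"local switch {i}", p.loop_test_local_switch(i))
              for i in range(p.n_local)]
    steps += [("local memory", p.scan_local_memory()),
              ("processing modules", p.run_test_vectors())]
    return steps
```

On the stub, `self_test(StubPlatform())` yields ten stages, all passing; on real hardware each method would drive the corresponding loop or scan.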

Relevance: 90.00%

Publisher:

Abstract:

Many growing networks possess accelerating statistics, where the number of links added with each new node is an increasing function of network size, so the total number of links increases faster than linearly with network size. In particular, biological networks can display quadratic growth in regulator number with genome size even while remaining sparsely connected. These features are mutually incompatible in standard treatments of network theory, which typically require that every new network node possesses at least one connection. To model sparsely connected networks, we generalize existing approaches and add each new node with a probabilistic number of links to generate accelerating, hyperaccelerating, or even decelerating network statistics in different regimes. Under preferential attachment, for example, slowly accelerating networks display stationary scale-free statistics relatively independent of network size, while more rapidly accelerating networks display a transition from scale-free to exponential statistics with network growth. Such transitions explain, for instance, the evolutionary record of single-celled organisms, which display strict size and complexity limits.
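A minimal simulation of this kind of growth rule is sketched below. The parameter values (`accel`, `base_rate`) are illustrative assumptions, not the paper's fitted quantities; the point is that each new node brings a probabilistic number of links whose expectation rises with network size, so zero-link nodes are allowed and the network stays sparse while its link count accelerates.

```python
import random

def grow_network(n_nodes, accel=0.5, base_rate=0.2, seed=0):
    """Grow a network where each new node adds a probabilistic number
    of links that increases with network size (accelerating statistics).
    Targets are chosen by preferential attachment. Returns the degree list."""
    rng = random.Random(seed)
    degree = [0]                      # one initial, isolated node
    for size in range(1, n_nodes):
        # expected link count rises with network size -> acceleration;
        # drawing it probabilistically permits zero links (sparseness)
        expected = base_rate * size ** accel
        n_links = int(expected) + (rng.random() < expected - int(expected))
        made = 0
        candidates = list(range(size))
        while made < n_links and candidates:
            # preferential attachment: weight by degree + 1 so that
            # isolated nodes can still be chosen
            weights = [degree[c] + 1 for c in candidates]
            target = rng.choices(candidates, weights=weights)[0]
            degree[target] += 1
            candidates.remove(target)
            made += 1
        degree.append(made)
    return degree
```

Raising `accel` toward (and past) 1 moves the model from slowly accelerating toward hyperaccelerating regimes, which is where the abstract's scale-free-to-exponential transition appears.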

Relevance: 90.00%

Publisher:

Abstract:

This study compares process data with microscopic observations from the anaerobic digestion of organic particles. As the first part of the study, this article presents detailed observations of microbial biofilm architecture and structure in a 1.25-L batch digester in which all particles are of equal age. Microcrystalline cellulose was used as the sole carbon and energy source. The digestions were inoculated with either leachate from a 220-L anaerobic municipal solid waste digester or strained rumen contents from a fistulated cow. The hydrolysis rate, when normalized by the amount of cellulose remaining in the reactor, was found to reach a constant value 1 day after inoculation with rumen fluid and 3 days after inoculation with digester leachate. A constant value of the mass-specific hydrolysis rate is argued to represent full colonization of the cellulose surface, and first-order kinetics only apply after this point. Additionally, the first-order hydrolysis rate constant, once surfaces were saturated with biofilm, was found to be two times higher with a rumen inoculum than with a digester leachate inoculum. Images generated by fluorescence in situ hybridization (FISH) probing and confocal laser scanning microscopy show that the microbial communities involved in the anaerobic biodegradation process exist entirely within the biofilm. For the reactor conditions used in these experiments, the predominant methanogens exist in ball-shaped colonies within the biofilm.
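The kinetics described here (a colonization lag, then first-order decay of the remaining cellulose) can be written down directly. The rate constants below are illustrative assumptions, not the paper's fitted values; only the lag times (1 day for rumen, 3 days for leachate) and the roughly two-fold ratio of rate constants come from the abstract.

```python
import math

def cellulose_remaining(c0, k, t, lag):
    """First-order hydrolysis after a colonization lag:
    C(t) = C0 for t <= lag, then C0 * exp(-k * (t - lag))
    once the cellulose surface is fully colonized."""
    if t <= lag:
        return c0
    return c0 * math.exp(-k * (t - lag))

# illustrative rate constants (1/day) -- assumed, not fitted;
# the abstract reports the rumen constant as about twice the leachate one
k_leachate = 0.1
k_rumen = 2 * k_leachate
lag_leachate, lag_rumen = 3.0, 1.0   # days, from the abstract
```

With these assumed numbers, a rumen-inoculated digestion both starts hydrolysis earlier and degrades faster once started, which is the qualitative pattern the abstract reports.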

Relevance: 90.00%

Publisher:

Abstract:

The history of human experimentation in the twelve years between Hitler's rise to power and the end of the Second World War is notorious in the annals of the twentieth century. The horrific experiments conducted at Dachau, Auschwitz, Ravensbrueck, Birkenau, and other National Socialist concentration camps reflected an extreme indifference to human life and human suffering. Unfortunately, they do not reflect the extent and complexity of the human experiments undertaken in the years between 1933 and 1945. Following the prosecution of twenty-three high-ranking National Socialist physicians and medical administrators for war crimes and crimes against humanity in the Nuremberg Medical Trial (United States v. Karl Brandt et al.), scholars have rightly focused attention on the nightmarish researches conducted by a small group of investigators on concentration camp inmates. Less well known are the alternative pathways that brought investigators to undertake human experimentation in other laboratories, settings, and nations.

Relevance: 90.00%

Publisher:

Abstract:

Based on the three-dimensional elastic inclusion model proposed by Dobrovolskii, we developed a rheological inclusion model to study earthquake preparation processes. Using the correspondence principle of rheological mechanics, we derived analytic expressions for the viscoelastic displacements U(r, t), V(r, t) and W(r, t), the normal strains epsilon_xx(r, t), epsilon_yy(r, t) and epsilon_zz(r, t), and the bulk strain theta(r, t) at an arbitrary point (x, y, z) along the X, Y and Z axes, produced by a three-dimensional inclusion in a semi-infinite rheological medium described by the standard linear rheological model. After computing the spatial-temporal variation of the bulk strain produced on the ground by such a spherical rheological inclusion, we obtained interesting results suggesting that the bulk strain produced by a hard inclusion changes with time in three stages (alpha, beta, gamma) with different characteristics, similar to geodetic deformation observations but different from the results for a soft inclusion. These theoretical results can be used to explain the characteristics of the spatial-temporal evolution, patterns and quadrant distribution of earthquake precursors, as well as the changeability, spontaneity and complexity of short-term and imminent precursors. They offer a theoretical basis for building physical models of earthquake precursors and for predicting earthquakes.
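For reference, the bulk strain that appears in these expressions is, as usual, the sum of the three normal strains (the trace of the strain tensor):

```latex
\theta(r, t) = \varepsilon_{xx}(r, t) + \varepsilon_{yy}(r, t) + \varepsilon_{zz}(r, t)
```

This is the quantity whose spatial-temporal variation on the ground surface the authors compute for the spherical inclusion.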

Relevance: 90.00%

Publisher:

Abstract:

Fires are integral to the healthy functioning of most ecosystems but are often poorly understood in policy and management; the relationship between floristic composition and habitat structure is intrinsically linked, particularly after fire. The aim of this study was to test whether the variability of habitat structure, or of floristic composition and abundance, in forests at a regional scale can be explained in terms of fire frequency, using historical data and experimental prescribed burns. We tested this hypothesis in open eucalypt forests of Fraser Island, off the east coast of Australia. Fraser Island dunes show progressive stages of plant succession as access to nutrients decreases across the island. We found that fire frequency was not a good predictor of floristic composition or abundance across dune systems; rather, its effects were dune-specific. In contrast, habitat structure was strongly influenced by fire frequency, independent of dune system. A dense understorey occurred in frequently burnt areas, whereas infrequently burnt areas had a more even distribution of plant heights. Plant communities returned to pre-burn levels of composition and abundance within 6 months of a fire, and frequently burnt areas were dominated by early successional plant species. These ecosystems were characterized by low diversity, and frequently burnt areas on the east coast were dominated by Pteridium. Greater midstorey canopy cover in infrequently burnt areas reduces light penetration and allows other species to compete more effectively with Pteridium. Our results strongly indicate that frequent fires on the island have resulted in a decrease in relative diversity through the dominance of several species. Prescribed fire represents a powerful management tool to shape the habitat structure and complexity of Fraser Island forests.

Relevance: 90.00%

Publisher:

Abstract:

The LCST transitions of novel N-isopropylacrylamide (NIPAM) star polymers, prepared using the four-armed RAFT agent pentaerythritol tetrakis(3-(S-benzyltrithiocarbonyl)propionate) (PTBTP), and of their hydrolyzed linear arms were studied using H-1 NMR, PFG-NMR, and DLS. The aim was to determine the effect of polymer architecture and of the end groups derived from RAFT agents on the LCST. The LCST transitions of star PNIPAM were significantly depressed by the presence of the hydrophobic star core and possibly the benzyl end groups. The effect was molecular weight dependent and diminished once the number of repeating units per arm reached >= 70. The linear PNIPAM exhibited an LCST of 35 degrees C regardless of molecular weight; the presence of both hydrophilic and hydrophobic end groups after hydrolysis from the star core was suggested to cancel their effects on the LCST. A significant decrease in R-H was observed below the LCST for both star and linear PNIPAM and was attributed to the formation of n-clusters. Application of a scaling law to the linear PNIPAM data indicated a cluster size of n = 6. Tethering to the hydrophobic star core appeared to inhibit n-cluster formation in the lowest molecular weight stars; this may be due to enhanced stretching of the polymer chains or to the presence of larger numbers of n-clusters at temperatures below those measured.

Relevance: 90.00%

Publisher:

Abstract:

SQL (Structured Query Language) is one of the essential topics in foundation database courses in higher education. Despite its apparently simple syntax, learning to use the full power of SQL can be a very difficult activity. In this paper, we introduce SQLator, a web-based interactive tool for learning SQL. SQLator's key function is its evaluate function, which allows users to check the correctness of their query formulations; the evaluation engine is based on complex heuristic algorithms. The tool also provides instructors with facilities to create and populate database schemas with an associated pool of SQL queries. It currently hosts two databases with a combined pool of more than 300 queries, divided into three categories according to query complexity. Users can perform unlimited executions and evaluations of query formulations and/or view the solutions. The evaluate function has a high success rate in judging a user's statement as correct (or incorrect) with respect to the question. We present the basic architecture and functions of SQLator, discuss its value as an educational technology, and report on educational outcomes based on studies conducted at the School of Information Technology and Electrical Engineering, The University of Queensland.
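To make the evaluation idea concrete: the simplest (and weakest) form of such a check is to execute both the student's query and the instructor's reference query and compare their result sets. SQLator's actual engine uses heuristic algorithms, so the sketch below is not its method; the schema and data are toy assumptions standing in for an instructor-created pool database.

```python
import sqlite3

def same_result(student_sql, reference_sql, conn):
    """Naive execution-based correctness check: run both queries and
    compare their sorted result rows. A query that fails to run at
    all is treated as incorrect."""
    try:
        got = sorted(conn.execute(student_sql).fetchall())
    except sqlite3.Error:
        return False  # the student query does not even parse or run
    expected = sorted(conn.execute(reference_sql).fetchall())
    return got == expected

# toy schema standing in for an instructor-created pool database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO student VALUES (?, ?)",
                 [(1, "Ann"), (2, "Bob")])
```

Note that result-set equality on one sample database is only evidence of correctness, not proof (two different queries can coincide on particular data), which is one reason a practical tool needs more sophisticated heuristics.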