980 results for design of experiments


Relevance:

100.00%

Publisher:

Abstract:

The development of bay-wide estimates of recreational harvest has been identified as a high priority by the Chesapeake Bay Scientific Advisory Committee (CBSAC) and by the Chesapeake Bay Program, as reflected in the Chesapeake Bay Blue Crab Fishery Management Plan (Chesapeake Bay Program 1996). In addition, the Bi-State Blue Crab Advisory Committee (BBCAC), formed in 1996 by mandate from the legislatures of Maryland and Virginia to advise on crab management, has also recognized the importance of estimating the levels and trends of catches in the recreational fishery. Recently, the BBCAC adopted limit and target biological reference points. These analyses have been predicated on assumptions regarding the relative magnitudes of the recreational and commercial catches. The reference points depend on determining the total number of crabs removed from the population. In essence, the number removed by the various fishery sectors represents a minimum estimate of the population size. If a major fishery sector is not represented, the total population will accordingly be underestimated. If the relative contribution of the unrepresented sector is constant over time and it harvests the same components of the population as the other sectors, it may be argued that the population estimate derived from the other sectors is biased but still adequately represents trends in population size over time. If either of these two constraints is not met, the validity of relative trends over time is suspect. With the recent increases in the human population of the Chesapeake Bay watershed, there is reason to be concerned that the recreational catch may not have been a constant proportion of the total harvest over time. It is therefore important to assess the catch characteristics and the magnitude of the recreational fishery to evaluate this potential bias. (PDF contains 70 pages)
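The bias argument above can be made concrete with a little arithmetic. The numbers below are invented for illustration (they are not from the report); they show how an omitted sector whose share grows over time produces a spurious downward trend, while a constant share only scales the estimate.

```python
# Illustrative arithmetic only: all values are invented, not taken from the report.
years = [1990, 1995, 2000]
true_removals = [100.0, 100.0, 100.0]   # total crabs removed (flat true trend)
rec_share = [0.10, 0.25, 0.40]          # recreational share grows over time

# Estimate based only on the represented (commercial) sector:
commercial_only = [t * (1 - s) for t, s in zip(true_removals, rec_share)]
# a spurious decline from 90 to 60, despite a flat true trend

# If the recreational share were constant, the estimate would be biased low
# by a fixed factor, but the trend would be preserved:
constant_share = [t * (1 - 0.10) for t in true_removals]
# a constant 90: biased low, trend intact
```

This is exactly why a changing recreational share undermines trend inference even when the commercial catch is well measured.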

Abstract:

In the past, many different methodologies were devised to support software development, and separate sets of methodologies were developed to support the analysis of software artefacts. We have identified this mismatch as one of the causes of the poor reliability of embedded systems software. The issue with software development styles is that they are "analysis-agnostic": they do not try to structure the code in a way that lends itself to analysis. Analysis is usually applied post mortem, after the software has been developed, and it requires a large amount of effort. The issue with software analysis methodologies is that they do not exploit available information about the system being analyzed.

In this thesis we address the above issues by developing a new methodology, called "analysis-aware" design, that links software development styles with the capabilities of analysis tools. This methodology forms the basis of a framework for interactive software development. The framework consists of an executable specification language and a set of analysis tools based on static analysis, testing, and model checking. The language enforces an analysis-friendly code structure and offers primitives that allow users to implement their own testers and model checkers directly in the language. We introduce a new approach to static analysis that takes advantage of the capabilities of a rule-based engine. We have applied the analysis-aware methodology to the development of a smart home application.
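The thesis's specification language is not reproduced here, but the flavor of "analysis-aware" code, a transition function structured for exhaustive exploration plus a user-written tester living alongside it, can be sketched in Python. The smart-home door example and all names (`door_controller`, `check_invariant`) are hypothetical.

```python
# A sketch of analysis-aware style, not the thesis's actual language:
# the application logic is a pure transition function, which makes it
# trivial for a user-written tester to enumerate every reachable state.

def door_controller(state: str, event: str) -> str:
    """Hypothetical smart-home door logic as a pure state-transition table."""
    table = {
        ("locked", "unlock"): "closed",
        ("closed", "open"):   "open",
        ("open",   "close"):  "closed",
        ("closed", "lock"):   "locked",
    }
    return table.get((state, event), state)  # invalid events leave state unchanged

def check_invariant() -> bool:
    """A user-written 'tester' in the spirit of the framework: exhaustively
    explore all states reachable from 'locked' and check a safety property."""
    events = ["unlock", "open", "close", "lock"]
    seen, frontier = set(), ["locked"]
    while frontier:
        s = frontier.pop()
        if s in seen:
            continue
        seen.add(s)
        # Safety property: a locked door never opens directly.
        assert not (s == "locked" and door_controller(s, "open") == "open")
        frontier.extend(door_controller(s, e) for e in events)
    return True
```

Because the state space is finite and the transition function is side-effect free, the tester doubles as a tiny explicit-state model checker, which is the structural property the methodology aims to enforce.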

Abstract:

A general framework for multi-criteria optimal design is presented which is well-suited for automated design of structural systems. A systematic computer-aided optimal design decision process is developed which allows the designer to rapidly evaluate and improve a proposed design by taking into account the major factors of interest related to different aspects such as design, construction, and operation.

The proposed optimal design process requires selecting the most promising choice of design parameters from a large design space, based on an evaluation using specified criteria. The design parameters specify a particular design, so they relate to member sizes, structural configuration, etc. The evaluation of the design uses performance parameters, which may include structural response parameters, risks due to uncertain loads and modeling errors, construction and operating costs, etc. Preference functions are used to implement the design criteria in a "soft" form: each preference function gives a measure of the degree of satisfaction of one design criterion. The overall evaluation measure for a design is built up from the individual measures for each criterion through a preference combination rule. The goal of the optimal design process is to obtain the design with the highest overall evaluation measure, which makes optimal design an optimization problem.
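As a concrete illustration of preference functions and a combination rule, here is a minimal Python sketch. The exponential preference shape, the geometric-mean combination, and every number below are assumptions made for illustration; the text does not prescribe these particular forms.

```python
import math

# Assumed "soft" criterion: satisfaction is 1 at the target value and
# decays smoothly as the performance parameter moves away from it.
def preference(value: float, target: float, tolerance: float) -> float:
    return math.exp(-((value - target) / tolerance) ** 2)

# Assumed combination rule (geometric mean): one badly violated criterion
# drags the overall evaluation measure down for the whole design.
def overall_measure(prefs: list) -> float:
    product = 1.0
    for p in prefs:
        product *= p
    return product ** (1.0 / len(prefs))

# Hypothetical candidate design evaluated on two criteria:
cost_pref = preference(value=120.0, target=100.0, tolerance=50.0)  # cost (k$)
defl_pref = preference(value=8.0, target=5.0, tolerance=4.0)       # deflection (mm)
score = overall_measure([cost_pref, defl_pref])  # in (0, 1]; higher is better
```

Maximizing `score` over the design parameters is the optimization problem the abstract refers to.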

Genetic algorithms are stochastic optimization methods based on evolutionary theory. They provide the search power necessary to explore high-dimensional design spaces and seek optimal solutions. Two specialized genetic algorithms, hGA and vGA, are presented here for continuous and discrete optimization problems, respectively.

The methodology is demonstrated with several examples involving the design of truss and frame systems. These examples are solved by using the proposed hGA and vGA.
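Since hGA and vGA themselves are not described here, a generic genetic algorithm can stand in for them to show the overall loop (selection, crossover, mutation) on a toy discrete member-sizing problem. The fitness function and every parameter below are invented for illustration.

```python
import random

# Generic GA sketch for discrete truss member sizing; NOT the hGA/vGA of
# the text. All constants and the toy fitness function are invented.
random.seed(0)
N_MEMBERS, POP, GENS = 4, 30, 60
AREAS = [1.0, 2.0, 3.5, 5.0, 7.5, 10.0]  # allowed cross-section areas (cm^2)

def fitness(design):
    """Toy objective: minimize total area (a weight proxy) while a crude
    'stiffness' sum stays above a threshold; infeasibility is penalized."""
    weight = sum(design)
    stiffness = sum(a ** 0.5 for a in design)
    penalty = 100.0 if stiffness < 6.0 else 0.0
    return -(weight + penalty)  # higher is better

def crossover(a, b):
    cut = random.randrange(1, N_MEMBERS)  # one-point crossover
    return a[:cut] + b[cut:]

def mutate(design, rate=0.1):
    return [random.choice(AREAS) if random.random() < rate else a for a in design]

pop = [[random.choice(AREAS) for _ in range(N_MEMBERS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]  # truncation selection keeps the fitter half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)  # fittest member sizing found
```

Swapping the toy fitness for a structural analysis plus the preference-based overall evaluation measure described above recovers the shape of the proposed design process.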

Abstract:

A two-mode adjustable superresolving filter based on a birefringent filter is proposed. This kind of filter achieves superresolution in two modes of adjustment: one is rotation of the binary pupil filter about the optical axis of the system, and the other is tilt of the filter away from the pupil plane about an axis parallel or perpendicular to the optical axis of the crystal. The filter acts as a complex amplitude filter in the former mode and as a pure phase filter in the latter. By analyzing two superresolving parameters, we obtain the optimal design parameters that ensure a large field of view, a large superresolving range, and a high setting accuracy. This kind of filter can provide more flexibility in practical applications. (c) 2006 Optical Society of America.

Abstract:

Computational protein design (CPD) is a burgeoning field that uses a physical-chemical or knowledge-based scoring function to create protein variants with new or improved properties. This exciting approach has recently been used to generate proteins with entirely new functions, ones that are not observed in naturally occurring proteins. For example, several enzymes were designed to catalyze reactions that are not in the repertoire of any known natural enzyme. In these designs, novel catalytic activity was built de novo (from scratch) into a previously inert protein scaffold. In addition to de novo enzyme design, the computational design of protein-protein interactions can also be used to create novel functionality, such as the neutralization of influenza. Our goal here was to design a protein that can self-assemble with DNA into nanowires. We used computational tools to homodimerize a transcription factor that binds a specific sequence of double-stranded DNA. We arranged the protein-protein and protein-DNA binding sites so that self-assembly could occur in a linear fashion to generate nanowires. Upon mixing our designed protein homodimer with the double-stranded DNA, the molecules immediately self-assembled into nanowires. This nanowire topology was confirmed using atomic force microscopy. A co-crystal structure showed that the nanowire is assembled via the desired interactions. To the best of our knowledge, this is the first example of a protein-DNA self-assembly that does not rely on covalent interactions. We anticipate that this new material will stimulate further interest in the development of advanced biomaterials.

Abstract:

The prospect of terawatt-scale electricity generation using photovoltaic (PV) devices places strict requirements on the active semiconductor's optoelectronic properties and elemental abundance. After reviewing the constraints placed on an "earth-abundant" solar absorber, we find zinc phosphide (α-Zn3P2) to be an ideal candidate. In addition to its near-optimal direct band gap of 1.5 eV, high visible-light absorption coefficient (>10^4 cm^-1), and long minority-carrier diffusion length (>5 μm), Zn3P2 is composed of the abundant elements Zn and P and has excellent physical properties for scalable thin-film deposition. However, to date, a Zn3P2 device of sufficient efficiency for commercial applications has not been demonstrated. Record efficiencies of 6.0% and 4.3% have been reported for multicrystalline and thin-film cells, respectively. Performance has been limited by the intrinsic p-type conductivity of Zn3P2, which restricts devices to Schottky and heterojunction designs. Because Zn3P2 interfaces remain poorly understood, an ideal heterojunction partner has not yet been found.

The goal of this thesis is to explore the upper limit of solar conversion efficiency achievable with a Zn3P2 absorber through the design of an optimal heterojunction PV device. To do so, we investigate three key aspects: material growth, interface energetics, and device design. First, the growth of Zn3P2 on GaAs(001) is studied using compound-source molecular-beam epitaxy (MBE). We successfully demonstrate pseudomorphic growth of Zn3P2 epilayers with controlled orientation and optoelectronic properties. Next, the energy-band alignments of epitaxial Zn3P2 interfaces with II-VI and III-V semiconductors are measured via high-resolution x-ray photoelectron spectroscopy to determine the most appropriate heterojunction partner. From this work, we identify ZnSe as a nearly ideal n-type emitter for a Zn3P2 PV device. Finally, various II-VI/Zn3P2 heterojunction solar cell designs, including substrate and superstrate architectures, are fabricated and evaluated on the basis of their solar conversion efficiency.