853 results for 1094


Relevance:

10.00%

Publisher:

Abstract:

Quantitative optical spectroscopy has the potential to provide an effective, low-cost, and portable solution for cervical pre-cancer screening in resource-limited communities. However, clinical studies to validate the use of this technology in resource-limited settings require low power consumption and good quality control that is minimally influenced by the operator or by variable environmental conditions in the field. The goal of this study was to evaluate the effects of two potential sources of error, calibration and pressure, on the extraction of absorption and scattering properties of normal cervical tissues in a resource-limited setting in Leogane, Haiti. Our results show that self-calibrated measurements improved scattering measurements through real-time correction of system drift, in addition to minimizing the time required for post-calibration. Variations in pressure (tested without the potential confounding effects of calibration error) caused local changes in vasculature and scatterer density that significantly impacted the tissue absorption and scattering properties. Future spectroscopic systems intended for clinical use, particularly where operator training is not viable and environmental conditions are unpredictable, should incorporate a real-time self-calibration channel and collect diffuse reflectance spectra at a consistent pressure to maximize data integrity.

Relevance:

10.00%

Publisher:

Abstract:

The goal of this paper is to improve our understanding of the role of institutional arrangements and ecological factors in facilitating the emergence and sustainability of successful collective action in small-scale fishing social-ecological systems. Using a modified logistic growth function, we simulate how ecological factors (i.e., carrying capacity) affect the ability of small-scale fishing communities with varying degrees of institutional development (i.e., the timeliness with which new institutions are adopted and the degree to which harvesting effort is reduced) to avoid overexploitation. Our results show that strong and timely institutions are necessary but not sufficient to maintain sustainable harvests over time. The sooner communities adopt institutions, and the stronger the institutions they adopt, the more likely they are to sustain the resource stock. Exactly how quickly institutions must be adopted, and by how much harvesting effort must be reduced, depends on the ecological carrying capacity of the species at the particular location. Small differences in carrying capacity between fishing sites, even under scenarios of similar institutional development, greatly affect the likelihood of effective collective action. © 2009 Elsevier B.V. All rights reserved.
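
As a rough illustration of the kind of simulation described above, the following sketch couples logistic stock growth to a harvest term and cuts harvesting effort once an institution is adopted; the functional form, parameter values, and the function name simulate_stock are illustrative assumptions, not the paper's model.

import numpy as np

def simulate_stock(K, r=0.5, q=0.3, effort=1.0,
                   t_adopt=20, effort_cut=0.5, T=100):
    """Logistic growth minus harvest; effort drops once an institution is adopted."""
    x = K  # start the stock at carrying capacity (illustrative choice)
    trajectory = []
    for t in range(T):
        e = effort * (effort_cut if t >= t_adopt else 1.0)  # institution in force
        x = max(x + r * x * (1.0 - x / K) - q * e * x, 0.0)
        trajectory.append(x)
    return np.array(trajectory)

# Two sites differing only in carrying capacity, same institutional response.
for K in (50.0, 100.0):
    print(f"K = {K:5.1f} -> stock after 100 steps: {simulate_stock(K)[-1]:6.2f}")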

Relevance:

10.00%

Publisher:

Abstract:

Axisymmetric radiating and scattering structures whose rotational invariance is broken by non-axisymmetric excitations present an important class of problems in electromagnetics. For such problems, a cylindrical wave decomposition formalism can be used to efficiently obtain numerical solutions to the full-wave frequency-domain problem. Often, the far-field, or Fraunhofer, region is of particular interest in scattering cross-section and radiation pattern calculations; yet it is usually impractical to compute full-wave solutions for this region. Here, we propose a generalization of the Stratton-Chu far-field integral adapted to the 2.5D formalism. The integration over a closed, axially symmetric surface is analytically reduced to a line integral on a meridional plane. We benchmark this computational technique by comparing it with analytical Mie solutions for a plasmonic nanoparticle, and apply it to the design of a three-dimensional polarization-insensitive cloak.
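
For reference, one common form of the Stratton-Chu far-field integral, together with the azimuthal step behind the reduction to a meridional line integral, can be sketched as follows (sign and normalization conventions vary with the assumed time dependence, and this is not the paper's exact derivation):

\[
\mathbf{E}_{\mathrm{far}}(\hat{\mathbf{r}}) = \frac{ik}{4\pi}\,\hat{\mathbf{r}}\times\oint_{S}\Bigl[\hat{\mathbf{n}}\times\mathbf{E}(\mathbf{r}') - \eta\,\hat{\mathbf{r}}\times\bigl(\hat{\mathbf{n}}\times\mathbf{H}(\mathbf{r}')\bigr)\Bigr]\, e^{-ik\,\hat{\mathbf{r}}\cdot\mathbf{r}'}\,\mathrm{d}S'.
\]

For an axisymmetric surface with fields expanded as \(\mathbf{F}(\rho',z')\,e^{im\varphi'}\), writing \(\hat{\mathbf{r}}\cdot\mathbf{r}' = \rho'\sin\theta\cos(\varphi-\varphi') + z'\cos\theta\) and using

\[
\int_{0}^{2\pi} e^{im\varphi'}\, e^{-ik\rho'\sin\theta\cos(\varphi-\varphi')}\,\mathrm{d}\varphi' = 2\pi\,(-i)^{m}\, J_{m}(k\rho'\sin\theta)\, e^{im\varphi}
\]

evaluates the azimuthal integration in closed form, leaving only a line integral in \((\rho',z')\) along the meridional contour.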

Relevance:

10.00%

Publisher:

Abstract:

We use mechanical translation of a coded aperture for code division multiple access compression of video. We discuss the compressed video's temporal resolution and present experimental results for reconstructions of > 10 frames of temporal data per coded snapshot.
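
A minimal sketch of the forward model implied by this setup is shown below; the frame count, mask statistics, and shift pattern are made-up illustrative choices, not the experimental configuration.

import numpy as np

rng = np.random.default_rng(0)
H, W, T = 64, 64, 12               # frame size and number of frames per snapshot
video = rng.random((T, H, W))      # stand-in for the scene's temporal frames
mask = (rng.random((H, W)) > 0.5).astype(float)   # one static binary coded aperture

snapshot = np.zeros((H, W))
for t in range(T):
    shifted = np.roll(mask, shift=t, axis=0)      # mechanical translation of the aperture
    snapshot += shifted * video[t]                # per-frame modulation, then summation

print(snapshot.shape)  # (64, 64): one coded measurement encoding 12 frames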

Relevance:

10.00%

Publisher:

Abstract:

Strong coupling between a two-level system (TLS) and bosonic modes produces dramatic quantum optics effects. We consider a one-dimensional continuum of bosons coupled to a single localized TLS, a system which may be realized in a variety of plasmonic, photonic, or electronic contexts. We present the exact many-body scattering eigenstate obtained by imposing open boundary conditions. Multiphoton bound states appear in the scattering of two or more photons due to the coupling between the photons and the TLS. Such bound states are shown to have a large effect on scattering of both Fock- and coherent-state wave packets, especially in the intermediate coupling-strength regime. We compare the statistics of the transmitted light with a coherent state having the same mean photon number: as the interaction strength increases, the one-photon probability is suppressed rapidly, and the two- and three-photon probabilities are greatly enhanced due to the many-body bound states. This results in non-Poissonian light. © 2010 The American Physical Society.
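
The baseline for that comparison is the Poissonian photon-number distribution of a coherent state with the same mean photon number,

\[
P_{\mathrm{coh}}(n) = e^{-\bar{n}}\,\frac{\bar{n}^{\,n}}{n!}, \qquad \bar{n} = |\alpha|^{2},
\]

so the suppressed one-photon and enhanced two- and three-photon probabilities of the transmitted field mark a departure from Poissonian statistics.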

Relevance:

10.00%

Publisher:

Abstract:

Multilevel algorithms are a successful class of optimization techniques that address the mesh partitioning problem for mapping meshes onto parallel computers. They usually combine a graph contraction algorithm with a local optimization method that refines the partition at each graph level. To date, these algorithms have been used almost exclusively to minimize the cut-edge weight in the graph, with the aim of minimizing the parallel communication overhead. However, it has been shown that for certain classes of problems, the convergence of the underlying solution algorithm is strongly influenced by the shape or aspect ratio of the subdomains. Therefore, in this paper, the authors modify the multilevel algorithms to optimize a cost function based on the aspect ratio. Several variants of the algorithms are tested and shown to provide excellent results.
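
One way such a cost function could look is sketched below; the particular aspect-ratio measure (boundary length relative to a circle of equal area) and the function names are hypothetical and need not match the paper's definition.

import math

def aspect_ratio(area: float, boundary_length: float) -> float:
    # Ratio of the subdomain's boundary length to the perimeter of a circle
    # with the same area: 1.0 is ideal, larger values mean a worse shape.
    ideal_perimeter = 2.0 * math.sqrt(math.pi * area)
    return boundary_length / ideal_perimeter

def partition_cost(subdomains):
    """Sum of per-subdomain aspect ratios; a local refinement step would try
    vertex moves that reduce this cost at each graph level."""
    return sum(aspect_ratio(a, b) for a, b in subdomains)

# (area, boundary length) for a square-ish and an elongated subdomain
print(partition_cost([(1.0, 4.0), (1.0, 10.0)]))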

Relevance:

10.00%

Publisher:

Abstract:

Realizing scalable performance on high-performance computing systems is not straightforward for single-phenomenon codes (such as computational fluid dynamics [CFD]). This task is magnified considerably when the target software involves the interactions of a range of phenomena that have distinctive solution procedures involving different discretization methods. Retaining data integrity and preserving the ordering of the calculation procedures are significant challenges. A strategy for parallelizing this multiphysics family of codes is described for software exploiting finite-volume discretization methods on unstructured meshes using iterative solution procedures. A mesh partitioning-based SPMD approach is used. However, since different variables use distinct discretization schemes, distinct partitions are required; techniques for addressing this issue are described using the mesh-partitioning tool JOSTLE. In this contribution, the strategy is tested for a variety of test cases under a wide range of conditions (e.g., problem size, number of processors, asynchronous/synchronous communications) using a variety of strategies for mapping the mesh partition onto the processor topology.

Relevance:

10.00%

Publisher:

Abstract:

We study the special case of the m-machine flow shop problem in which the processing time of each operation of job j is equal to p_j; this variant of the flow shop problem is known as the proportionate flow shop problem. We show that for any number of machines and for any regular performance criterion we can restrict our search for an optimal schedule to permutation schedules. Moreover, we show that the problem of minimizing total weighted completion time is solvable in O(n^2) time. © 1998 John Wiley & Sons, Ltd.
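
For a permutation schedule in a proportionate flow shop, the job in position k finishes at the sum of the first k processing times plus (m - 1) times the largest of them; the sketch below uses that standard identity to evaluate the total weighted completion time of a given permutation (the job data and weights are made-up illustrative values).

def total_weighted_completion(perm, p, w, m):
    total, running_sum, running_max = 0.0, 0.0, 0.0
    for j in perm:
        running_sum += p[j]
        running_max = max(running_max, p[j])
        completion = running_sum + (m - 1) * running_max  # position-k completion time
        total += w[j] * completion
    return total

p = {1: 3.0, 2: 1.0, 3: 2.0}   # p_j: processing time of job j on every machine
w = {1: 1.0, 2: 2.0, 3: 1.0}   # w_j: weight of job j
print(total_weighted_completion([2, 3, 1], p, w, m=3))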

Relevance:

10.00%

Publisher:

Abstract:

This paper considers a special class of flow-shop problems, known as the proportionate flow shop. In such a shop, each job flows through the machines in the same order and has equal processing times on the machines. The processing times of different jobs may be different. It is assumed that all operations of a job may be compressed by the same amount which will incur an additional cost. The objective is to minimize the makespan of the schedule together with a compression cost function which is non-decreasing with respect to the amount of compression. For a bicriterion problem of minimizing the makespan and a linear cost function, an O(n log n) algorithm is developed to construct the Pareto optimal set. For a single criterion problem, an O(n^2) algorithm is developed to minimize the sum of the makespan and compression cost. Copyright © 1999 John Wiley & Sons, Ltd.

Relevance:

10.00%

Publisher:

Abstract:

We consider the problem of scheduling independent jobs on two machines in an open shop, a job shop and a flow shop environment. Both machines are batching machines, which means that several operations can be combined into a batch and processed simultaneously on a machine. The batch processing time is the maximum processing time of operations in the batch, and all operations in a batch complete at the same time. Such a situation may occur, for instance, during the final testing stage of circuit board manufacturing, where burn-in operations are performed in ovens. We consider cases in which there is no restriction on the size of a batch on a machine, and in which a machine can process only a bounded number of operations in one batch. For most of the possible combinations of restrictions, we establish the complexity status of the problem.
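
A minimal sketch of the batching rule on a single batching machine follows; the grouping into batches is an arbitrary example, not an optimal schedule.

def batch_completions(batches):
    # A batch's processing time is the maximum processing time of its operations,
    # and every operation in the batch completes when the batch completes.
    finish, completions = 0.0, {}
    for batch in batches:                 # batches are processed one after another
        finish += max(p for _, p in batch)
        for op, _ in batch:
            completions[op] = finish      # all operations in the batch finish together
    return completions

batches = [[("a", 4.0), ("b", 2.0)], [("c", 3.0)]]
print(batch_completions(batches))   # {'a': 4.0, 'b': 4.0, 'c': 7.0}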

Relevance:

10.00%

Publisher:

Abstract:

We consider a range of single machine and identical parallel machine pre-emptive scheduling models with controllable processing times. For each model we study a single criterion problem to minimize the compression cost of the processing times subject to the constraint that all due dates should be met. We demonstrate that each single criterion problem can be formulated in terms of minimizing a linear function over a polymatroid, and this justifies the greedy approach to its solution. A unified technique allows us to develop fast algorithms for solving both single criterion problems and bicriteria counterparts.
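
The polymatroid fact alluded to here is, roughly, Edmonds' classical result that a linear objective over a polymatroid is optimized by a greedy allocation; a generic statement (not the paper's specific polymatroids or reformulation) is: for a nondecreasing submodular function \(\varphi\) with \(\varphi(\emptyset)=0\), the polymatroid is

\[
P(\varphi) = \Bigl\{ x \in \mathbb{R}^{N}_{\ge 0} : \sum_{j\in A} x_j \le \varphi(A)\ \text{for all } A\subseteq N \Bigr\},
\]

and \(\sum_j w_j x_j\) with \(w \ge 0\) is maximized over \(P(\varphi)\) by indexing the elements so that \(w_{j_1}\ge\cdots\ge w_{j_n}\) and setting

\[
x_{j_k} = \varphi(\{j_1,\dots,j_k\}) - \varphi(\{j_1,\dots,j_{k-1}\}), \qquad k = 1,\dots,n.
\]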

Relevance:

10.00%

Publisher:

Abstract:

A comprehensive solution of solidification/melting processes requires the simultaneous representation of free surface fluid flow, heat transfer, phase change, nonlinear solid mechanics and, possibly, electromagnetics together with their interactions, in what is now known as multiphysics simulation. Such simulations are computationally intensive and the implementation of solution strategies for multiphysics calculations must embed their effective parallelization. For some years, together with our collaborators, we have been involved in the development of numerical software tools for multiphysics modeling on parallel cluster systems. This research has involved a combination of algorithmic procedures, parallel strategies and tools, plus the design of a computational modeling software environment and its deployment in a range of real world applications. One output from this research is the three-dimensional parallel multiphysics code, PHYSICA. In this paper we report on an assessment of its parallel scalability on a range of increasingly complex models drawn from actual industrial problems, on three contemporary parallel cluster systems.

Relevance:

10.00%

Publisher:

Abstract:

This paper describes the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on 11 September 2001. In particular, the paper describes the development of the High-rise Evacuation Evaluation Database (HEED). This is a flexible qualitative research tool which contains the full transcribed interview accounts and the coded evacuee experiences extracted from those transcripts. The data and information captured and stored in the HEED database are not only unique, but also provide a means to address current and emerging issues relating to human factors associated with the evacuation of high-rise buildings.

Relevance:

10.00%

Publisher:

Abstract:

We consider various single machine scheduling problems in which the processing time of a job depends either on its position in a processing sequence or on its start time. We focus on problems of minimizing the makespan or the sum of (weighted) completion times of the jobs. In many situations we show that the objective function is priority-generating, and therefore the corresponding scheduling problem under series-parallel precedence constraints is polynomially solvable. In other situations we provide counter-examples that show that the objective function is not priority-generating.
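
Typical examples of such processing-time models from this literature (illustrative forms; the paper's exact models are not reproduced here) are a positional effect

\[
p_j(r) = p_j\, r^{\alpha},
\]

where \(r\) is the job's position in the sequence and \(\alpha < 0\) models learning while \(\alpha > 0\) models aging, and a start-time-dependent (linearly deteriorating) effect

\[
p_j(t) = p_j + b_j\, t,
\]

where \(t\) is the job's start time and \(b_j \ge 0\) is its deterioration rate.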