915 results for Two-level scheduling and optimization


Relevance:

100.00%

Publisher:

Abstract:

Mn, Fe, Ca, Co, Ni, Cu, Zn, Cd, Sn, Tl, Pb and Bi have been estimated in thirty-two nodules from the Pacific, Atlantic and Indian oceans. Various features of the composition of manganese nodules are discussed: element abundances, degrees of enrichment, inter-element relationships (notably between Ni and Cu, and between Zn and Cd), regional variations and some aspects of statistical distribution.
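
The inter-element relationships the abstract mentions are the kind of pattern a simple correlation coefficient captures. A minimal sketch with hypothetical Ni and Cu concentrations (illustrative values, not the study's data):

```python
import numpy as np

# Hypothetical Ni and Cu concentrations (wt %) for a few nodules,
# illustrating the kind of inter-element correlation the study discusses.
ni = np.array([1.1, 0.9, 1.3, 0.6, 1.0])
cu = np.array([0.9, 0.7, 1.1, 0.5, 0.8])

# Pearson correlation coefficient between the two elements
r = np.corrcoef(ni, cu)[0, 1]
print(round(r, 2))  # → 0.99
```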

Relevance:

100.00%

Publisher:

Abstract:

This paper determines the capability of two photogrammetric systems in terms of their measurement uncertainty in an industrial context. The first system – V-STARS inca3 from Geodetic Systems Inc. – is a commercially available measurement solution. The second system comprises an off-the-shelf Nikon D700 digital camera fitted with a 28 mm Nikkor lens and the research-based Vision Measurement Software (VMS). The uncertainty estimate of these two systems is determined with reference to a calibrated constellation of points established by a Leica AT401 laser tracker. The calibrated points have an average associated standard uncertainty of 12.4 μm, spanning a maximum distance of approximately 14.5 m. Against this reference, V-STARS inca3 achieved an estimated standard uncertainty of 43.1 μm, thus outperforming its manufacturer's specification, while the D700/VMS combination achieved a standard uncertainty of 187 μm.
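
As a rough illustration of how a standard uncertainty can be estimated against a calibrated reference, the RMS deviation of measured point coordinates from the laser-tracker reference could be computed as below. This is one plausible estimator under assumed inputs; the paper's exact procedure may differ:

```python
import numpy as np

def standard_uncertainty(measured, reference):
    """RMS deviation of measured 3-D point coordinates from calibrated
    reference coordinates; a simple proxy for standard uncertainty."""
    d = np.linalg.norm(np.asarray(measured) - np.asarray(reference), axis=1)
    return np.sqrt(np.mean(d ** 2))

# Hypothetical reference constellation and measurements offset from it
reference = np.zeros((2, 3))
measured = np.array([[3.0, 0.0, 0.0], [0.0, 4.0, 0.0]])
u = standard_uncertainty(measured, reference)
print(u)  # → 3.5355339059327378  (sqrt((3^2 + 4^2) / 2))
```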

Relevance:

100.00%

Publisher:

Abstract:

A tenet of modern radiotherapy (RT) is to identify the treatment target accurately, after which the high-dose treatment volume may be expanded into the surrounding tissues to create the clinical and planning target volumes. Respiratory motion can induce errors in target volume delineation and dose delivery in radiation therapy for thoracic and abdominal cancers. Historically, radiotherapy treatment planning in the thoracic and abdominal regions has used 2D or 3D images acquired under uncoached free-breathing conditions, irrespective of whether the target tumor is moving. Once the gross target volume has been delineated, standard margins are commonly added to account for motion. However, these generic margins do not usually take the target motion trajectory into consideration, which can lead to under- or over-estimation of motion, with the subsequent risk of missing the target during treatment or irradiating excessive normal tissue; this introduces systematic errors into treatment planning and delivery. In clinical practice, four-dimensional (4D) imaging has become popular for RT motion management. It provides temporal information about tumor and organ-at-risk motion, and it permits patient-specific treatment planning. The most common contemporary imaging technique for identifying tumor motion is 4D computed tomography (4D-CT). However, CT has poor soft-tissue contrast and carries an ionizing radiation hazard. In the last decade, 4D magnetic resonance imaging (4D-MRI) has emerged as a tool for imaging respiratory motion, especially in the abdomen, because of its superior soft-tissue contrast. Recently, several 4D-MRI techniques have been proposed, including prospective and retrospective approaches. Nevertheless, 4D-MRI techniques face several challenges: 1) suboptimal and inconsistent tumor contrast with large inter-patient variation; 2) relatively low temporal-spatial resolution; and 3) the lack of a reliable respiratory surrogate.
In this research work, novel 4D-MRI techniques applying MRI weightings not used in existing 4D-MRI techniques, including T2/T1-weighted, T2-weighted and diffusion-weighted MRI, were investigated. A result-driven retrospective phase sorting method was proposed and applied both in image space and in the k-space of MR imaging. Novel image-based respiratory surrogates were developed, improved and evaluated.
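
Retrospective phase sorting of the kind described can be sketched as follows: detect end-inhale peaks in a respiratory surrogate signal and assign each acquisition a phase bin between consecutive peaks. This is a generic illustration of phase sorting, not the result-driven method proposed in the work:

```python
import numpy as np

def phase_sort(surrogate, times, n_bins=8):
    """Assign each acquisition a respiratory phase bin in [0, n_bins),
    where phase 0 is an end-inhale peak of the surrogate signal.
    Samples outside a complete breathing cycle keep bin -1."""
    s = np.asarray(surrogate, float)
    # Crude peak detection: strict local maxima of the surrogate
    peaks = [i for i in range(1, len(s) - 1) if s[i - 1] < s[i] >= s[i + 1]]
    bins = np.full(len(s), -1)
    for p0, p1 in zip(peaks[:-1], peaks[1:]):
        for i in range(p0, p1):
            phase = (times[i] - times[p0]) / (times[p1] - times[p0])
            bins[i] = min(int(phase * n_bins), n_bins - 1)
    return bins

# Synthetic 1 Hz breathing trace sampled at 10 Hz
t = np.arange(0.0, 3.0, 0.1)
bins = phase_sort(np.cos(2 * np.pi * t), t)
print(bins[10], bins[19])  # → 0 7
```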

Relevance:

100.00%

Publisher:

Abstract:

Rolling Isolation Systems provide a simple and effective means for protecting components from horizontal floor vibrations. In these systems a platform rolls on four steel balls which, in turn, rest within shallow bowls. The trajectories of the balls are uniquely determined by the horizontal and rotational velocity components of the rolling platform, and thus provide nonholonomic constraints. In general, the bowls are not parabolic, so the potential energy function of this system is not quadratic. This thesis presents the application of Gauss's Principle of Least Constraint to the modeling of rolling isolation platforms. The equations of motion are described in terms of a redundant set of constrained coordinates. Coordinate accelerations are uniquely determined at any point in time via Gauss's Principle by solving a linearly constrained quadratic minimization. In the absence of any modeled damping, the equations of motion conserve energy. This mathematical model is then used to find the bowl profile that minimizes response acceleration subject to a displacement constraint.
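
The linearly constrained quadratic minimization at the heart of Gauss's Principle can be illustrated with a small KKT solve. This is a generic sketch: the mass matrix M, unconstrained accelerations a_free, and acceleration-level constraints A a = b are placeholders, not the thesis's actual rolling-platform model:

```python
import numpy as np

def gauss_accelerations(M, a_free, A, b):
    """Solve the linearly constrained quadratic minimization of
    Gauss's Principle: minimize (1/2)(a - a_free)^T M (a - a_free)
    subject to A @ a = b, via the KKT system."""
    n, m = M.shape[0], A.shape[0]
    K = np.block([[M, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([M @ a_free, b])
    return np.linalg.solve(K, rhs)[:n]

# Toy check: a unit-mass system whose unconstrained accelerations are
# (1, 1), constrained so the two coordinate accelerations sum to zero.
# The minimizer projects (1, 1) onto the constraint surface: (0, 0).
M = np.eye(2)
a = gauss_accelerations(M, np.array([1.0, 1.0]),
                        np.array([[1.0, 1.0]]), np.array([0.0]))
print(np.allclose(a, 0.0))  # → True
```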

Relevance:

100.00%

Publisher:

Abstract:

The effectiveness of an optimization algorithm can be reduced to its ability to navigate an objective function’s topology. Hybrid optimization algorithms combine various optimization algorithms using a single meta-heuristic so that the hybrid algorithm is more robust, computationally efficient, and/or accurate than the individual algorithms it is made of. This thesis proposes a novel meta-heuristic that uses search vectors to select the constituent algorithm that is appropriate for a given objective function. The hybrid is shown to perform competitively against several existing hybrid and non-hybrid optimization algorithms over a set of three hundred test cases. This thesis also proposes a general framework for evaluating the effectiveness of hybrid optimization algorithms. Finally, this thesis presents an improved Method of Characteristics Code with novel boundary conditions, which better characterizes pipelines than previous codes. This code is coupled with the hybrid optimization algorithm in order to optimize the operation of real-world piston pumps.
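
The selection idea behind such a meta-heuristic can be sketched in a few lines: periodically trial-run each constituent algorithm and hand control to whichever reduces the objective most. The two constituents below (a coarse and a fine coordinate search) and the probing rule are illustrative stand-ins, not the thesis's search-vector mechanism:

```python
import numpy as np

def coord_step(f, x, step):
    """One coordinate-descent pass with a fixed step size,
    accepting only moves that reduce f."""
    x = x.copy()
    for i in range(len(x)):
        for d in (step, -step):
            trial = x.copy()
            trial[i] += d
            if f(trial) < f(x):
                x = trial
                break
    return x

def hybrid_minimize(f, x0, iters=60, probe=5):
    """Selection meta-heuristic sketch: every `probe` iterations,
    trial-run each constituent and keep the one that reduces f most."""
    steps = [lambda x: coord_step(f, x, 0.5),   # coarse: fast far from optimum
             lambda x: coord_step(f, x, 0.01)]  # fine: effective near optimum
    x = np.asarray(x0, float)
    active = 0
    for k in range(iters):
        if k % probe == 0:
            gains = [f(x) - f(s(x)) for s in steps]
            active = int(np.argmax(gains))
        x = steps[active](x)
    return x

x = hybrid_minimize(lambda v: float(np.sum(v ** 2)), [3.2, -2.4])
print(np.sum(x ** 2) < 1e-3)  # → True
```

The probe starts with the coarse search (large gains while far from the optimum) and switches to the fine search once the coarse one stalls, which is the essential benefit a hybrid offers over either constituent alone.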

Relevance:

100.00%

Publisher:

Abstract:

Mercury concentrations ([Hg]) in Arctic food fish often exceed guidelines for human subsistence consumption. Previous research on two food fish species, Arctic char (Salvelinus alpinus) and lake trout (Salvelinus namaycush), indicates that anadromous fish have lower [Hg] than nonanadromous fish, but there have been no intraregional comparisons. Also, no comparisons of [Hg] among anadromous (sea-run), resident (marine access but do not migrate), and landlocked (no marine access) life history types of Arctic char and lake trout have been published. Using intraregional data from 10 lakes in the West Kitikmeot area of Nunavut, Canada, we found that [Hg] varied significantly among species and life history types. Differences among species-life history types were best explained by age-at-size and C:N ratios (indicator of lipid); [Hg] was significantly and negatively related to both. At a standardized fork length of 500 mm, lake trout had significantly higher [Hg] (mean 0.17 µg/g wet wt) than Arctic char (0.09 µg/g). Anadromous and resident Arctic char had significantly lower [Hg] (each 0.04 µg/g) than landlocked Arctic char (0.19 µg/g). Anadromous lake trout had significantly lower [Hg] (0.12 µg/g) than resident lake trout (0.18 µg/g), but no significant difference in [Hg] was seen between landlocked lake trout (0.21 µg/g) and other life history types. Our results are relevant to human health assessments and consumption guidance and will inform models of Hg accumulation in Arctic fish.

Relevance:

100.00%

Publisher:

Abstract:

Marine sediment records from the Oligocene and Miocene reveal clear 400,000-year (400-kyr) climate cycles related to variations in orbital eccentricity. These cycles are also observed in the Plio-Pleistocene records of the global carbon cycle. However, they are absent in the Late Pleistocene ice-age record of the past 1.5 million years. Here, we present a simulation of global ice volume over the past 5 million years with a coupled system of four 3-D ice-sheet models. Our simulation shows that the 400-kyr eccentricity cycles of Antarctica vary coherently with δ13C records during the Pleistocene, suggesting that they drive the long-term carbon-cycle changes throughout the past 35 million years. The 400-kyr response of Antarctica is eventually suppressed by the dominant 100-kyr glacial cycles of the large ice sheets in the Northern Hemisphere (NH).

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

The premiere of Season Two of Twin Peaks garnered some of the highest ratings of the series, with celebrated filmmaker and co-creator David Lynch stepping back into the director’s chair. Yet within this episode many traditional television conventions are flouted, and in response the ratings dropped dramatically the following week. From its slow-paced opening scenes, in which an old man admonishes the wounded, bleeding protagonist to drink his warm milk before it gets cold, followed by a vision of a giant speaking in riddles, this episode not only tested its audience’s patience but also seemed to set out deliberately to confuse them. In this essay I will explore how this episode is an example of auteur television, an episode in which the director expresses a consistency of style and theme similar to their other work, and examine how Lynch’s approach to televisual aesthetics has influenced the way contemporary film directors have crossed over into the television medium. However, when the differences between the two media of film and television are taken into account, notions of authorship, with regard to the position of the director, become complicated, especially when considering contemporary television and the rise of the showrunner as a key creative force. Even looking back at Lynch’s contribution to Twin Peaks, it becomes clear that the series was deeply collaborative, with Lynch absent during parts of the filming. Yet the extensive material written about Twin Peaks still shows a continuing tendency to place Lynch as the sole author. The placement of Lynch as author can be argued in relation to the episodes he directed (as will be explored below in relation to the first episode of Season Two), but authorship cannot be attributed to him alone when considering the series as a whole.
Finally, I will discuss how the figure of the television auteur has become a central element of television reception rather than production, an integral part of a viewer’s search for narrative meaning in a medium where complexity and mystery are now expected and enjoyed. Just as fans scrambled to uncover the many secrets and mysteries of Twin Peaks by looking to Lynch’s other works for answers, a similar process is experienced by fans of many television shows today.

Relevance:

100.00%

Publisher:

Abstract:

Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance, in either naïve or expert populations. Here, we test the hypotheses that two traits typically enhanced in autism spectrum disorders, attention to detail and systemizing, may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and a group of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related to performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

Relevance:

100.00%

Publisher:

Abstract:

In this study, a thermal and exergetic analysis and a performance evaluation of a wet cooling tower operating with seawater and with fresh water are carried out, and the effect of operating parameters on tower performance is investigated. Using energy and mass balance equations together with experimental results, a mathematical model was developed and implemented in EES code. Given the scarcity of fresh water, seawater cooling is an attractive option for the future of cooling, so the effect of seawater salinity in the range of 1 g/kg to 60 g/kg on performance characteristics (air efficiency, water efficiency, cooling tower outlet water temperature, exergy flow, and exergy efficiency) is examined in comparison with fresh water. Among these effects are a decrease in air efficiency of about 3% and an increase in water efficiency of about 1.5%. Moreover, fouling reduces tower performance by about 15%; this phenomenon and its consequences, such as increased outlet water temperature and excess tower volume, are shown and agree with other published work. Optimization was also performed to minimize cost, maximize air efficiency, and minimize exergy destruction; the results show that minimizing exergy destruction also satisfies both the cost minimization and the air-efficiency maximization, although this need not hold for all inputs and optimization settings. The model is validated by comparing computational results with experimental data, showing good accuracy.
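
For orientation, a cooling tower efficiency is commonly defined as the achieved cooling range relative to the maximum theoretical range down to the inlet-air wet-bulb temperature. The paper distinguishes air and water efficiencies, which may use different formulas; the sketch below uses only the common textbook definition:

```python
def tower_efficiency(t_water_in, t_water_out, t_wet_bulb):
    """Cooling tower range efficiency: actual cooling range divided by
    the maximum theoretical range (down to the inlet-air wet-bulb
    temperature).  A common textbook definition, for illustration only."""
    return (t_water_in - t_water_out) / (t_water_in - t_wet_bulb)

# Hypothetical operating point (temperatures in °C)
eff = tower_efficiency(40.0, 32.0, 24.0)
print(eff)  # → 0.5
```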

Relevance:

100.00%

Publisher:

Abstract:

We analyze the causal structure of the two-dimensional (2D) reduced background used in the perturbative treatment of a head-on collision of two D-dimensional Aichelburg–Sexl gravitational shock waves. After defining all causal boundaries, namely the future light-cone of the collision and the past light-cone of a future observer, we obtain characteristic coordinates using two independent methods. The first is a geometrical construction of the null rays which define the various light cones, using a parametric representation. The second is a transformation of the 2D reduced wave operator for the problem into a hyperbolic form. The characteristic coordinates are then compactified allowing us to represent all causal light rays in a conformal Carter–Penrose diagram. Our construction holds to all orders in perturbation theory. In particular, we can easily identify the singularities of the source functions and of the Green’s functions appearing in the perturbative expansion, at each order, which is crucial for a successful numerical evaluation of any higher order corrections using this method.