993 results for code level


Relevance: 30.00%

Abstract:

Scientists planning to use underwater stereoscopic image technologies are often faced with numerous problems during the methodological implementations: commercial equipment is too expensive; the setup or calibration is too complex; or the imaging processing (i.e. measuring objects in the stereo-images) is too complicated to be performed without a time-consuming phase of training and evaluation. The present paper addresses some of these problems and describes a workflow for stereoscopic measurements for marine biologists. It also provides instructions on how to assemble an underwater stereo-photographic system with two digital consumer cameras and gives step-by-step guidelines for setting up the hardware. The second part details a software procedure to correct stereo-image pairs for lens distortions, which is especially important when using cameras with non-calibrated optical units. The final part presents a guide to the process of measuring the lengths (or distances) of objects in stereoscopic image pairs. To reveal the applicability and the restrictions of the described systems and to test the effects of different types of camera (a compact camera and an SLR type), experiments were performed to determine the precision and accuracy of two generic stereo-imaging units: a diver-operated system based on two Olympus Mju 1030SW compact cameras and a cable-connected observatory system based on two Canon 1100D SLR cameras. In the simplest setup without any correction for lens distortion, the low-budget Olympus Mju 1030SW system achieved mean accuracy errors (percentage deviation of a measurement from the object's real size) between 10.2 and -7.6% (overall mean value: -0.6%), depending on the size, orientation and distance of the measured object from the camera. With the single lens reflex (SLR) system, very similar values between 10.1% and -3.4% (overall mean value: -1.2%) were observed. 
Correcting the lens distortion significantly improved the mean accuracy errors of both systems. Moreover, system precision (the spread of the accuracy) improved significantly in both systems. Neither the use of a wide-angle converter nor repeated reassembly of the system had a significant negative effect on the results. The study shows that underwater stereophotography, independent of the system, has a high potential for robust and non-destructive in situ sampling and can be used without prior specialist training.
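The length measurement in a stereo-image pair ultimately rests on triangulation. As an illustration only (not code from the paper), the sketch below assumes an ideally rectified camera pair with hypothetical focal length `f` (in pixels) and baseline `B` (in metres); matched image points are triangulated into 3D, and the object length is the distance between the two resulting points.

```python
def triangulate(xl, xr, y, f, B):
    """Back-project one matched point from a rectified stereo pair.
    xl, xr: horizontal pixel coordinates in the left/right image
            (principal point at 0); y: vertical pixel coordinate
            (identical in both images after rectification);
    f: focal length in pixels; B: camera baseline in metres."""
    d = xl - xr            # disparity between the two views
    Z = f * B / d          # depth along the optical axis
    X = xl * Z / f
    Y = y * Z / f
    return (X, Y, Z)

def length(p, q):
    """Euclidean distance between two triangulated 3D points."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
```

With f = 1000 px and B = 0.2 m, a disparity of 20 px places a point 10 m from the cameras; measuring an object then reduces to triangulating its two endpoints and taking their distance.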

Relevance: 30.00%

Abstract:

Based on a radiocarbon- and paleomagnetically-dated sediment record from the northern Red Sea, and on the exceptional sensitivity of regional changes in the oxygen isotope composition of sea water to the sea-level-dependent water exchange with the Indian Ocean, we provide a new global sea-level reconstruction spanning the last glacial period. The sea-level record was extracted from the temperature-corrected benthic stable oxygen isotopes, using coral-based sea-level data as constraints on the sea-level/oxygen-isotope relationship. Although the general features of this millennial-scale sea-level record have strong similarities to the rather symmetric and gradual Southern Hemisphere climate patterns, we observe, in contrast to previous findings, that pronounced sea-level rises of up to 25 m generally correspond with Northern Hemisphere warmings as recorded in Greenland ice-core interstadial intervals, whereas sea-level lowstands mostly occur during cold phases. Corroborated by CLIMBER-2 model results, the close connection of millennial-scale sea-level changes to Northern Hemisphere temperature variations indicates a primary climatic control on the mass balance of the major Northern Hemisphere ice sheets and does not require a considerable Antarctic contribution.

Relevance: 30.00%

Abstract:

Water removal in paper manufacturing is an energy-intensive process. The dewatering process generally consists of four stages, the first three of which remove water mechanically through gravity filtration, vacuum dewatering and wet pressing. In the fourth stage, water is removed thermally, which is the most expensive stage in terms of energy use. To analyse water removal during vacuum dewatering, a numerical model was created using a Level Set method. Various 2D structures of the paper model were created in MATLAB code, with randomly positioned circular fibres of identical orientation. The model considers the influence of the forming fabric, which supports the paper sheet during dewatering, by using volume forces to represent flow resistance in the momentum equation. The models were used to estimate the dry content of the porous structure for various dwell times. The relation between dry content and dwell time was compared to laboratory data for paper sheets with basis weights of 20 and 50 g/m2 exposed to vacuum levels between 20 kPa and 60 kPa. The comparison showed reasonable agreement for dewatering and air flow rates. The random positioning of the fibres influences the dewatering rate only slightly. More accurate comparisons would require considering the random orientation of the fibres, as well as their deformation and displacement during the dewatering process.
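As a rough illustration of the kind of geometry the paper's MATLAB model starts from (a sketch under assumed parameters, not the authors' code), the snippet below places circular fibre cross-sections at random positions in a 2D sheet and estimates the pore fraction on a regular sample grid, a crude stand-in for the level-set representation of the structure:

```python
import random

def random_fibre_sheet(n_fibres, width, height, seed=0):
    """Place circular fibre cross-sections at random positions
    (identical orientation, as in the 2D model described above).
    Returns a list of (x, y) centres inside the sheet."""
    rng = random.Random(seed)
    return [(rng.uniform(0, width), rng.uniform(0, height))
            for _ in range(n_fibres)]

def porosity(centres, width, height, radius, nx=200, ny=200):
    """Estimate the pore (void) fraction of the sheet by sampling
    an nx-by-ny grid and testing each sample against every fibre."""
    solid = 0
    for i in range(nx):
        for j in range(ny):
            x = (i + 0.5) * width / nx
            y = (j + 0.5) * height / ny
            if any((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
                   for cx, cy in centres):
                solid += 1
    return 1.0 - solid / (nx * ny)
```

Re-running with different seeds mimics the paper's observation that the random positioning of fibres changes the structure, and hence the dewatering behaviour, only slightly.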

Relevance: 30.00%

Abstract:

Unstructured-mesh codes for the modelling of continuum physics phenomena have evolved to provide the facility to model complex interacting systems. Such codes have the potential to deliver high performance on parallel platforms for a small investment in programming. The critical parameters for success are minimising changes to the code (to ease maintenance) while providing high parallel efficiency, scalability to large numbers of processors and portability to a wide range of platforms. The paradigm of domain decomposition with message passing has for some time been demonstrated to provide a high level of efficiency, scalability and portability across shared- and distributed-memory systems without the need to re-author the code in a new language. This paper addresses these issues in the parallelisation of a complex three-dimensional unstructured-mesh Finite Volume multiphysics code and discusses the implications of automating the parallelisation process.
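A minimal sketch of the domain-decomposition idea, assuming the simplest possible "mesh" (a 1D chain of cells) rather than the paper's unstructured 3D meshes: each processor owns a contiguous block of cells and receives a one-cell halo from its neighbours via message passing each iteration.

```python
def partition(n_cells, n_parts):
    """Split a 1D chain of mesh cells into near-equal contiguous
    subdomains, one per processor."""
    base, extra = divmod(n_cells, n_parts)
    parts, start = [], 0
    for p in range(n_parts):
        size = base + (1 if p < extra else 0)
        parts.append(range(start, start + size))
        start += size
    return parts

def halo(part, n_cells):
    """Cells owned by neighbouring subdomains whose values must be
    received each iteration (the message-passing overlap layer)."""
    lo, hi = part[0], part[-1]
    return [c for c in (lo - 1, hi + 1) if 0 <= c < n_cells]
```

For a real unstructured mesh the same two ingredients remain: a partition (typically from a graph partitioner) and a halo of off-processor neighbour cells; only their computation is more involved.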

Relevance: 30.00%

Abstract:

Constant false alarm rate (CFAR) techniques can be used for pseudo-noise (PN) code acquisition in spread spectrum (SS) communication systems, and all CFAR techniques perform well for PN code acquisition against a homogeneous background. Against a non-homogeneous background, however, some CFAR techniques degrade rapidly. GO/SO (greatest-of/smallest-of) CFAR and the adaptive censored mean level detector (ACMLD) are two adaptive CFAR techniques, which are analyzed here and compared with other CFAR techniques. The simulation results show that GO/SO CFAR is superior to the other techniques, maintaining a short mean acquisition time (MAT) even in environments with strong clutter noise, while ACMLD is suitable for backgrounds with strong interfering targets.
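The averaging rules compared above can be sketched in a few lines (an illustrative reconstruction, not the paper's simulation code): the detector estimates the noise level from leading and lagging reference windows around the cell under test and declares acquisition when that cell exceeds a scaled threshold.

```python
def cfar_noise_estimate(lead, lag, mode="CA"):
    """Noise-level estimate from the leading/lagging reference windows.
    CA: cell-averaging; GO: greatest-of; SO: smallest-of
    (the two window averages are combined per the chosen rule)."""
    a = sum(lead) / len(lead)
    b = sum(lag) / len(lag)
    if mode == "CA":
        return (a + b) / 2
    if mode == "GO":
        return max(a, b)
    if mode == "SO":
        return min(a, b)
    raise ValueError(mode)

def detect(cell, lead, lag, scale, mode="CA"):
    """Declare detection (code acquisition) when the cell under test
    exceeds scale * estimated noise level."""
    return cell > scale * cfar_noise_estimate(lead, lag, mode)
```

The sketch makes the trade-off visible: GO takes the larger window average and so resists clutter edges, while SO takes the smaller one and so tolerates interfering targets in one window.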

Relevance: 30.00%

Abstract:

A fundamental step in understanding the effects of irradiation on metallic uranium and uranium dioxide ceramic fuels, or any material, must start with the nature of radiation damage at the atomic level. The atomic displacement damage results in a multitude of defects that influence fuel performance. Nuclear reactions are coupled, in that changing one variable will alter others through feedback. In the field of fuel performance modeling, these difficulties are addressed through the use of empirical models rather than models based on first principles. Empirical models can be used predictively through careful manipulation of input variables, but only in the limited circumstances closely tied to the data used to create the model. While empirical models are efficient and give acceptable results, those results are applicable only within the range of the existing data. This narrow window prevents the modeling of changed operating conditions, since the new conditions would fall outside the calibration data set. This work is part of a larger effort to correct this modeling deficiency. Uranium dioxide and metallic uranium fuels are analyzed with a kinetic Monte Carlo (kMC) code as part of an overall effort to generate a stochastic and predictive fuel code. The kMC investigations include sensitivity analysis of point defect concentrations, thermal gradients implemented through a temperature-variation mesh grid, and migration energy values. In this work, fission damage is represented primarily through defects on the oxygen anion sublattice. Results were also compared between the various models. Past studies of kMC point defect migration have not adequately addressed non-standard migration events such as clustering and dissociation of vacancies.
As such, the General Utility Lattice Program (GULP) code was utilized to generate new migration energies so that additional non-migration events can be included in the kMC code in the future for more comprehensive studies. Defect energies were calculated to generate barrier heights for single-vacancy migration, clustering and dissociation of two vacancies, and vacancy migration under the influence of both an additional oxygen and a uranium vacancy.
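A minimal residence-time kMC step, assuming Arrhenius rates with a hypothetical attempt frequency of 1e13 Hz and migration barriers in eV (an illustrative sketch, not the code used in this work): an event is drawn with probability proportional to its rate, and the clock advances by an exponentially distributed waiting time.

```python
import math, random

def kmc_step(events, temperature, rng, attempt_freq=1e13, kB=8.617e-5):
    """One residence-time kinetic Monte Carlo step.
    events: list of (name, Em) with migration barriers Em in eV;
    rates follow the Arrhenius law rate = nu * exp(-Em / (kB * T)),
    kB in eV/K. Returns (chosen event index, time increment in s)."""
    rates = [attempt_freq * math.exp(-Em / (kB * temperature))
             for _, Em in events]
    total = sum(rates)
    r = rng.random() * total           # pick an event weighted by rate
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r <= acc:
            break
    dt = -math.log(rng.random()) / total   # exponential waiting time
    return i, dt
```

A barrier difference of a couple of eV makes one event dominate overwhelmingly at reactor temperatures, which is why barrier heights such as those computed with GULP control the simulated defect kinetics.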

Relevance: 30.00%

Abstract:

This document does NOT address the issue of oxygen data quality control (either real-time or delayed mode). As a preliminary step towards that goal, this document seeks to ensure that all countries deploying floats equipped with oxygen sensors document the data and metadata related to these floats properly. We produced this document in response to action item 14 from the AST-10 meeting in Hangzhou (March 22-23, 2009). Action item 14: Denis Gilbert to work with Taiyo Kobayashi and Virginie Thierry to ensure DACs are processing oxygen data according to recommendations. If the recommendations contained herein are followed, we will end up with a more uniform set of oxygen data within the Argo data system, allowing users to begin analysing not only their own oxygen data, but also those of others, in the true spirit of Argo data sharing. Indications provided in this document are valid as of the date of writing. It is very likely that changes in sensors, calibrations and conversion equations will occur in the future. Please contact V. Thierry (vthierry@ifremer.fr) about any inconsistencies or missing information. A dedicated webpage on the Argo Data Management website (www) contains all information regarding Argo oxygen data management: current and previous versions of this cookbook, oxygen sensor manuals, calibration sheet examples, examples of matlab code to process oxygen data, test data, etc.

Relevance: 30.00%

Abstract:

In the context of computer numerical control (CNC) and computer aided manufacturing (CAM), programming-language capabilities such as symbolic and intuitive programming, program portability and a rich geometrical portfolio have special importance. They save time, help avoid errors during part programming and permit code reuse. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and in portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit reuse of programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
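A toy compiler pass in the spirit of EGCL (purely illustrative; the statement forms `let` and `move` are hypothetical stand-ins, not the actual EGCL syntax): named variables are resolved and elementary ISO G01 linear moves are emitted.

```python
def compile_egcl(lines):
    """Translate a tiny variable-aware source into elementary G-code.
    Supports only 'let NAME = NUMBER' (bind a named variable) and
    'move X Y' (emit a G01 linear move, where X/Y may be variable
    names or literals); the real EGCL language is far richer."""
    env, out = {}, []

    def value(tok):
        # Resolve a token: named variable if bound, literal otherwise.
        return env[tok] if tok in env else float(tok)

    for line in lines:
        t = line.split()
        if t[0] == "let":          # e.g. "let width = 40"
            env[t[1]] = float(t[3])
        elif t[0] == "move":       # e.g. "move width 10"
            out.append(f"G01 X{value(t[1]):.3f} Y{value(t[2]):.3f}")
    return out
```

Because the output is elementary, machine-agnostic G-code, the same source can target any ISO-compliant controller, which is the portability argument the article makes.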

Relevance: 30.00%

Abstract:

Energy Conservation Measure (ECM) project selection is made difficult by real-world constraints, limited resources to implement savings retrofits, various suppliers in the market and project financing alternatives. Many of these energy-efficient retrofit projects should be viewed as a series of investments with annual returns for these traditionally risk-averse agencies. Given a list of available ECMs, federal, state and local agencies must determine how to implement projects at lowest cost. The most common methods of implementation planning are suboptimal with respect to cost. Federal, state and local agencies can obtain greater returns on their energy conservation investment than with traditional methods, regardless of the implementing organization. This dissertation outlines several approaches to improve the traditional energy conservation models. Public buildings in regions with similar energy conservation goals, in the United States or internationally, can also benefit greatly from this research. Additionally, many private owners of buildings are under mandates to conserve energy; e.g., Local Law 85 of the New York City Energy Conservation Code requires any building, public or private, to meet the most current energy code for any alteration or renovation. Thus, both public and private stakeholders can benefit from this research. The research in this dissertation advances and presents models that decision-makers can use to optimize the selection of ECM projects with respect to the total cost of implementation. A practical application of a two-level mathematical program with equilibrium constraints (MPEC) improves the current best practice for agencies concerned with making the most cost-effective selection leveraging energy services companies or utilities. The two-level model maximizes savings to the agency and profit to the energy services companies (Chapter 2). 
An additional model leverages a single congressional appropriation to implement ECM projects (Chapter 3). Returns from implemented ECM projects are used to fund additional ECM projects. In these cases, fluctuations in energy costs and uncertainty in the estimated savings severely influence ECM project selection and the amount of the appropriation requested. A proposed risk-aversion method imposes a minimum on the number of projects completed in each stage. A comparative method using Conditional Value at Risk is also analyzed, and time consistency is addressed. This work demonstrates how a risk-based, stochastic, multi-stage model with binary decision variables at each stage provides a much more accurate estimate for planning than the agency's traditional approach and deterministic models. Finally, in Chapter 4, a rolling-horizon model allows for subadditivity and superadditivity of the energy savings to simulate interactive effects between ECM projects. The approach uses McCormick (1976) inequalities to re-express constraints that involve the product of binary variables with an exact linearization (related to the convex hull of those constraints). This model additionally shows the benefits of learning between stages while remaining consistent with the single-congressional-appropriation framework.
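The exact linearization used for interactive savings can be made concrete. For binary x and y, the nonlinear product z = xy is replaced by the McCormick inequalities z <= x, z <= y, z >= x + y - 1, z >= 0, which leave exactly one feasible binary z for every (x, y). The check below (illustrative, not from the dissertation) verifies this over all binary combinations:

```python
def mccormick_feasible_z(x, y):
    """Binary z values satisfying the McCormick inequalities
    z <= x, z <= y, z >= x + y - 1 (and z >= 0, implied by z binary)
    for given binary x, y."""
    return [z for z in (0, 1)
            if z <= x and z <= y and z >= x + y - 1]
```

Because the only feasible z always equals x*y, a solver can treat the product as a plain linear variable, which is what lets the Chapter 4 model stay a linear binary program despite the multiplicative savings interactions.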

Relevance: 20.00%

Abstract:

Ant foraging on foliage can substantially affect how phytophagous insects use host plants and represents a high predation risk for caterpillars, which are important folivores. Ant-plant-herbivore interactions are especially pervasive in cerrado savanna due to continuous ant visitation to liquid food sources on foliage (extrafloral nectaries, insect honeydew). While searching for liquid rewards on plants, aggressive ants frequently attack or kill insect herbivores, decreasing their numbers. Because ants vary in diet and aggressiveness, their effect on herbivores also varies. Additionally, the differential occurrence of ant attractants (plant and insect exudates) on foliage produces variable levels of ant foraging within local floras and among localities. Here, we investigate how variation of ant communities and of traits among host plant species (presence or absence of ant attractants) can change the effect of carnivores (predatory ants) on herbivore communities (caterpillars) in a cerrado savanna landscape. We sampled caterpillars and foliage-foraging ants in four cerrado localities (70-460 km apart). We found that: (i) caterpillar infestation was negatively related with ant visitation to plants; (ii) this relationship depended on local ant abundance and species composition, and on local preference by ants for plants with liquid attractants; (iii) this was not related to local plant richness or plant size; (iv) the relationship between the presence of ant attractants and caterpillar abundance varied among sites from negative to neutral; and (v) caterpillars feeding on plants with ant attractants are more resistant to ant predation than those feeding on plants lacking attractants. Liquid food on foliage mediates host plant quality for lepidopterans by promoting generalized ant-caterpillar antagonism. 
Our study in cerrado shows that the negative effects of generalist predatory ants on herbivores are detectable at the community level, affecting patterns of abundance and host plant use by lepidopterans. The magnitude of ant-induced effects on caterpillar occurrence across the cerrado landscape may depend on how ants use plants locally and how they respond to liquid food on plants in different habitats. This study underscores the relevance of plant-ant and ant-herbivore interactions in cerrado and highlights the importance of a tritrophic perspective in this ant-rich environment.

Relevance: 20.00%

Abstract:

The purpose of this study was to assess whether the adhesive permits the collateral repair of axons originating from a vagus nerve into the interior of a sural nerve graft, and whether low-level laser therapy (LLLT) assists the regeneration process. The study sample consisted of 32 rats randomly separated into three groups: Control Group (CG; n=8), from which the intact sural nerve was collected; Experimental Group (EG; n=12), in which one of the ends of the sural nerve graft was coapted to the vagus nerve using fibrin glue; and Experimental Group Laser (EGL; n=12), in which the animals underwent the same procedures as those in EG with the addition of LLLT. Ten weeks after surgery, the animals were euthanized. Morphological analysis by optical and electron microscopy, and morphometry of the regenerated fibers, were employed to evaluate the results. Collateral regeneration of axons from the vagus nerve into the interior of the autologous graft was observed in EG and EGL; in CG all measured dimensions were greater and differed significantly from EG and EGL, except for the area and thickness of the myelin sheath, which differed significantly only from EG. The present study demonstrated that fibrin glue makes axonal regeneration feasible and is an efficient method to recover injured peripheral nerves, and that the use of low-level laser therapy enhances nerve regeneration.

Relevance: 20.00%

Abstract:

Neutrophils (PMN) play a central role in host defense against the neglected fungal infection paracoccidioidomycosis (PCM), which is caused by the dimorphic fungus Paracoccidioides brasiliensis (Pb). PCM is of major importance, especially in Latin America, and its treatment relies on the use of antifungal drugs. However, the course of treatment is lengthy, leading to side effects and even development of fungal resistance. The goal of the study was to use low-level laser therapy (LLLT) to stimulate PMN to fight Pb in vivo. Swiss mice with subcutaneous air pouches were inoculated with a virulent strain of Pb or fungal cell wall components (Zymosan), and then received LLLT (780 nm; 50 mW; 12.5 J/cm2; 30 seconds per point, giving a total energy of 0.5 J per point) on alternate days at two points on each hind leg, the aim being to reach the bone marrow in the femur with light. Non-irradiated animals were used as controls. The number and viability of the PMN that migrated to the inoculation site were assessed, as well as their ability to synthesize proteins, produce reactive oxygen species (ROS) and kill fungi. The highly pure PMN populations obtained after 10 days of infection were also subsequently cultured in the presence of Pb for trials of protein production, evaluation of mitochondrial activity, ROS production and quantification of viable fungal growth. PMN from mice that received LLLT were metabolically more active and had higher fungicidal activity against Pb both in vivo and in vitro. The kinetics of neutrophil protein production also correlated with a more activated state. LLLT may be a safe and non-invasive approach to dealing with PCM infection.

Relevance: 20.00%

Abstract:

Evolving interfaces were initially focused on solutions to scientific problems in fluid dynamics. With the advent of the more robust modeling provided by the Level Set method, their original boundaries of applicability were extended. In the Geometric Modeling area specifically, works published so far relating Level Set to three-dimensional surface reconstruction have centered on reconstruction from a point cloud dispersed in space; the approach based on parallel planar slices transverse to the object to be reconstructed is still incipient. On this basis, the present work analyses the feasibility of the Level Set method for three-dimensional reconstruction, offering a methodology that integrates previously published ideas of proven efficiency with proposals for handling limitations of the method not yet satisfactorily treated, in particular the excessive smoothing of fine contour features evolving under Level Set. In this regard, the Particle Level Set variant is suggested as a solution, for its proven intrinsic capability to preserve the mass of dynamic fronts. Finally, synthetic and real data sets are used to qualitatively evaluate the presented three-dimensional surface reconstruction methodology.
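The basic evolution law behind the Level Set approach can be sketched in one dimension (an illustrative explicit upwind step for phi_t + F|phi_x| = 0, with assumed grid spacing and time step; not the code of this work). The zero crossing of phi is the evolving contour; F > 0 moves it outward.

```python
def level_set_step(phi, F, dx, dt):
    """One explicit upwind step of phi_t + F |phi_x| = 0 on a 1D grid.
    phi: sampled level-set function (e.g. signed distance);
    F: constant normal speed; dx: grid spacing; dt: time step.
    Boundary values are left unchanged."""
    n = len(phi)
    new = phi[:]
    for i in range(1, n - 1):
        dminus = (phi[i] - phi[i - 1]) / dx   # backward difference
        dplus = (phi[i + 1] - phi[i]) / dx    # forward difference
        if F >= 0:   # upwind selection: take the outflow-side slopes
            grad2 = max(dminus, 0.0) ** 2 + min(dplus, 0.0) ** 2
        else:
            grad2 = min(dminus, 0.0) ** 2 + max(dplus, 0.0) ** 2
        new[i] = phi[i] - dt * F * grad2 ** 0.5
    return new
```

Repeated steps of exactly this kind smooth fine features of the front, which is the limitation the Particle Level Set variant counteracts by seeding marker particles near the interface to restore lost mass.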
