940 results for National Science Foundation (U.S.). Research Applied to National Needs Program.


Relevance: 100.00%

Abstract:

Background: Nanotechnologies are developing very rapidly, and nanomaterials (NMs) are increasingly used in a wide range of applications in science, industry, and biomedicine.

Relevance: 100.00%

Abstract:

One of the most significant research topics in computer vision is object detection. Most reported object detection results localise the detected object within a bounding box but do not explicitly label the edge contours of the object. Since object contours provide a fundamental diagnostic of object shape, some researchers have initiated work on linear contour feature representations for object detection and localisation. However, linear contour feature-based localisation depends heavily on the performance of linear contour detection in natural images, which can be perturbed significantly by a cluttered background. In addition, the conventional approach to achieving rotation-invariant features is to rotate the feature receptive field to align with the local dominant orientation before computing the feature representation. Grid resampling after rotation adds extra computational cost and increases the total time needed to compute the feature descriptor. Although this is not an expensive step on current computers, it is desirable for each step of the implementation to be fast to compute, especially when the number of local features grows and the application runs in real time on resource-limited "smart devices" such as mobile phones.

Motivated by these issues, this thesis proposes a 2D object localisation system that matches features of edge contour points, an alternative method that exploits shape information for object localisation. It is inspired by the fact that edge contour points are the basic components of shape contours. In addition, edge point detection is usually simpler to achieve than linear edge contour detection, so the proposed localisation system avoids the need for linear contour detection and reduces disruption from the image background. Moreover, since natural images usually contain many more edge contour points than interest points (i.e., corner points), new methods are also proposed to generate rotation-invariant local feature descriptors without pre-rotating the feature receptive field, improving the computational efficiency of the whole system. In detail, the 2D object localisation system matches edge contour point features within a constrained search area based on an initial pose estimate produced by a prior object detection process. The local feature descriptor obtains rotation invariance by exploiting the rotational symmetry of the hexagonal structure; accordingly, a set of local feature descriptors is proposed based on a hierarchical hexagonal grouping structure. Ultimately, the 2D object localisation system achieves very promising performance when matching the proposed edge contour point features, with a mean correct labelling rate of 0.8654 and a mean false labelling rate of 0.0314 on data from the Amsterdam Library of Object Images (ALOI). Furthermore, the proposed descriptors are compared with state-of-the-art descriptors and achieve competitive performance in pose estimation, with approximately half-pixel pose error.
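
As a hedged illustration of the descriptor idea (this is not the thesis's actual descriptor; the function, sampling radius, and data below are hypothetical), the sketch shows how the rotational symmetry of a hexagonal sampling pattern can yield rotation invariance without pre-rotating the receptive field: intensities sampled at the six hexagon vertices are cyclically shifted by a 60° image rotation, so the magnitudes of their discrete Fourier transform are unchanged.

```python
import numpy as np

def hex_ring_descriptor(image, center, radius):
    """Sketch of a rotation-invariant descriptor: sample intensities at the
    six vertices of a hexagon around `center`; a rotation by a multiple of
    60 degrees cyclically shifts the samples, so the magnitudes of their
    discrete Fourier transform are invariant to such rotations."""
    cy, cx = center
    angles = np.arange(6) * np.pi / 3.0            # hexagon vertex angles
    ys = np.round(cy + radius * np.sin(angles)).astype(int)
    xs = np.round(cx + radius * np.cos(angles)).astype(int)
    samples = image[ys, xs].astype(float)
    return np.abs(np.fft.fft(samples))             # cyclic-shift invariant

# Toy usage on a random image patch.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
d = hex_ring_descriptor(img, (16, 16), 5)
print(d)
```

No grid resampling is performed at any point, which is the computational saving the thesis motivates.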

Relevance: 100.00%

Abstract:

Rigid adherence to pre-specified thresholds and static graphical representations can lead to incorrect decisions on the merging of clusters. As an alternative to existing automated or semi-automated methods, we developed a visual analytics approach for performing hierarchical clustering analysis of short time-series gene expression data. Dynamic sliders control parameters such as the similarity threshold at which clusters are merged and the level of relative intra-cluster distinctiveness, which can be used to identify "weak edges" within clusters. An expert user can drill down to further explore the dendrogram and detect nested clusters and outliers, by using the sliders and by pointing and clicking on the representation to cut the branches of the tree at multiple heights. A prototype of this tool has been developed in collaboration with a small group of biologists for analysing their own datasets. Initial feedback on the tool has been positive.
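
A minimal sketch of the re-cutting operation such a similarity-threshold slider would drive, using SciPy's standard hierarchical clustering (the data here are random stand-ins for expression profiles; this is not the prototype's code):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy short time-series expression profiles: one row per gene.
rng = np.random.default_rng(1)
profiles = rng.random((50, 6))

# Build the dendrogram once; correlation distance suits expression data.
Z = linkage(profiles, method="average", metric="correlation")

# Re-cutting the tree at a new height is cheap, which is what makes an
# interactive threshold slider feasible: only fcluster is re-run.
for threshold in (0.2, 0.4, 0.6):
    labels = fcluster(Z, t=threshold, criterion="distance")
    print(threshold, labels.max(), "clusters")
```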

Relevance: 100.00%

Abstract:

The aim of the present study was to propose and evaluate the use of factor analysis (FA) to obtain latent variables (factors) that represent a set of pig traits simultaneously, for use in genome-wide selection (GWS) studies. We used crosses between outbred F2 populations of Brazilian Piau × commercial pigs. Data were obtained on 345 F2 pigs, genotyped for 237 SNPs, with 41 traits. FA allowed us to obtain four biologically interpretable factors: "weight", "fat", "loin", and "performance". These factors were used as dependent variables in multiple regression models of genomic selection (Bayes A, Bayes B, RR-BLUP, and Bayesian LASSO). The use of FA is presented as an interesting alternative for selecting individuals on multiple variables simultaneously in GWS studies; accuracy measurements for the factors were similar to those obtained when the original traits were considered individually. The overlap between the top 10% of individuals selected by each factor and those selected by the individual traits was also satisfactory. Moreover, the estimated marker effects for the traits were similar to those found for the relevant factor.
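
A hedged sketch of this two-step pipeline (random stand-in data with the study's dimensions; ridge regression is used here as a stand-in for RR-BLUP, and the scikit-learn calls are my choice, not the paper's software):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
traits = rng.random((345, 41))          # 41 phenotypic traits, 345 F2 pigs
snps = rng.integers(0, 3, (345, 237))   # 237 SNPs coded 0/1/2

# Step 1: reduce the 41 traits to a few latent factors
# (interpreted in the study as "weight", "fat", "loin", "performance").
fa = FactorAnalysis(n_components=4, rotation="varimax")
factor_scores = fa.fit_transform(traits)      # shape (345, 4)

# Step 2: use each factor score as the response in a whole-genome
# regression, yielding one estimated effect per SNP marker.
for k in range(factor_scores.shape[1]):
    model = Ridge(alpha=1.0).fit(snps, factor_scores[:, k])
    marker_effects = model.coef_
    print(f"factor {k}: largest |SNP effect| = {np.abs(marker_effects).max():.3f}")
```

Selecting the top 10% of individuals by predicted factor value then plays the role of multi-trait selection.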

Relevance: 100.00%

Abstract:

Three-dimensional direct numerical simulations (DNS) have been performed on a finite-size hemisphere-cylinder model at angle of attack AoA = 20° and Reynolds numbers Re = 350 and 1000. Under these conditions, massive separation exists on the nose and lee side of the cylinder, and at both Reynolds numbers the flow is found to be unsteady. Proper orthogonal decomposition (POD) and dynamic mode decomposition (DMD) are employed to study the primary instability that triggers unsteadiness at Re = 350. The dominant coherent flow structures identified at the lower Reynolds number are also found to exist at Re = 1000; the question is then posed whether the flow oscillations and structures found at the two Reynolds numbers are related. POD and DMD computations are performed using different subdomains of the DNS computational domain. Besides reducing the computational cost of the analyses, this also makes it possible to isolate spatially localized oscillatory structures from other, more energetic structures present in the flow. It is found that POD and DMD are in general sensitive to domain truncation, and uninformed choices of the subdomain may lead to inconsistent results. Analyses at Re = 350 show that the primary instability is related to the counter-rotating vortex pair that forms the three-dimensional afterbody wake and is characterized by the frequency St ≈ 0.11, in line with results in the literature. At Re = 1000, vortex shedding is present in the wake, with an associated broadband spectrum centered around the same frequency. The horn/leeward vortices at the cylinder lee side, upstream of the cylinder base, also present finite-amplitude oscillations at the higher Reynolds number. The spatial structure of these oscillations, described by the POD modes, is easily differentiated from that of the wake oscillations. Additionally, the frequency spectra associated with the lee-side vortices present well-defined peaks, corresponding to St ≈ 0.11 and its first few harmonics, as opposed to the broadband spectrum found in the wake.
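
A minimal sketch of snapshot POD via the SVD, with the subdomain restriction expressed as a row selection (arrays are random placeholders, not DNS data; the subdomain indices are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)
n_points, n_snapshots = 5000, 200
snapshots = rng.random((n_points, n_snapshots))   # flow-field snapshots, one column per time

def pod(snapshots):
    """Snapshot POD: subtract the temporal mean and take the SVD; the left
    singular vectors are the POD modes, squared singular values their energy."""
    fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
    modes, sing_vals, _ = np.linalg.svd(fluct, full_matrices=False)
    energy = sing_vals**2 / np.sum(sing_vals**2)
    return modes, energy

# Restricting the analysis to a subdomain is just a row selection; this is
# how localized oscillatory structures can be separated from more energetic
# structures living elsewhere in the domain.
subdomain = np.arange(1000)                       # indices of subdomain points
modes, energy = pod(snapshots[subdomain])
print("energy captured by first 3 modes:", energy[:3].sum())
```

The sensitivity to domain truncation noted in the abstract corresponds to how the recovered modes and energies change with the choice of `subdomain`.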

Relevance: 100.00%

Abstract:

Attention Deficit-Hyperactivity Disorder is a condition that affects 3 to 5 percent of children globally. Many of them live in areas with very few or no medical professionals qualified to help them. To help address this problem, a system was developed that allows physicians to follow their patients' progress and prescribe treatments. These treatments can be drugs or behavioral exercises. The behavioral exercises were designed in the form of games in order to motivate the patients, who are children, to engage with the treatment. The system allows the patients to play the prescribed games under the supervision of their tutors. Each game is designed to improve the patient's handling of the condition by training a specific mental component. The objective of this approach is to complement the traditional form of treatment by allowing a physician to prescribe therapeutic games and keep patients under supervision between their regular consultations. The main goal of this project is to provide patients with better control of their symptoms than with traditional therapy alone. Experimental field tests with children and clinical staff offer promising results. This research was developed in the context of a funded project involving INESC C (Polytechnic Institute of Leiria delegation), the Santo André Hospital of Leiria, and the start-up company PlusrootOne (which owns the project).

Relevance: 100.00%

Abstract:

The application of Computational Fluid Dynamics based on the Reynolds-Averaged Navier-Stokes equations to the simulation of bluff-body aerodynamics has been thoroughly investigated in the past. Although satisfactory accuracy can be obtained for some urban physics problems, the predictive capability of these methods is limited to the mean flow properties, while the ability to accurately predict turbulent fluctuations is recognized to be of fundamental importance when dealing with wind loading and pollution dispersion problems. The need to correctly take the flow dynamics into account when facing such problems has led researchers to move towards scale-resolving turbulence models such as Large Eddy Simulation (LES). The development and assessment of LES as a tool for the analysis of these problems is nowadays an active research field and represents a demanding engineering challenge. This research work has two objectives. The first is focused on wind load assessment and aims to study the capability of LES to reproduce wind load effects in terms of internal forces on structural members. This differs from the majority of existing research, where the performance of LES is evaluated only in terms of surface pressures, and is done with a view to adopting LES as a complementary design tool alongside wind tunnel tests. The second objective is to study the capability of LES to calculate pollutant dispersion in the built environment. The validation of LES in this field is considered of the utmost importance in order to conceive healthier and more sustainable cities. In order to validate the numerical setup adopted, a systematic comparison between numerical and experimental data is performed. The obtained results are intended to be used in drafting best-practice guidelines for the application of LES in the urban physics field, with particular attention to wind load assessment and pollution dispersion problems.
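
For the comparison between numerical and experimental data, dispersion-model validation commonly uses metrics such as the fraction of predictions within a factor of two of observations (FAC2) and the fractional bias (FB). The sketch below shows these standard definitions on placeholder numbers; the thesis's actual validation protocol and datasets are not reproduced here.

```python
import numpy as np

def fac2(sim, obs):
    """Fraction of predictions within a factor of two of the observations."""
    ratio = sim / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

def fractional_bias(sim, obs):
    """FB = 2 * (mean(obs) - mean(sim)) / (mean(obs) + mean(sim))."""
    return 2.0 * (obs.mean() - sim.mean()) / (obs.mean() + sim.mean())

# Hypothetical paired samples: measured vs LES-computed concentrations.
obs = np.array([1.2, 0.8, 2.5, 3.1, 0.4])
sim = np.array([1.0, 1.1, 2.1, 2.4, 0.9])
print(f"FAC2 = {fac2(sim, obs):.2f}, FB = {fractional_bias(sim, obs):.2f}")
```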

Relevance: 100.00%

Abstract:

The design optimization of industrial products has always been an essential activity to improve product quality while reducing time-to-market and production costs. Although cost management is very complex and spans all phases of the product life cycle, the control of geometrical and dimensional variations, known as Dimensional Management (DM), allows compliance with product and process requirements. Hence, tolerance-cost optimization becomes the main practice for an effective application of Design for Tolerancing (DfT) and Design to Cost (DtC) approaches, by connecting product tolerances with the associated manufacturing costs. However, despite the growing interest in this topic, profitable industrial application of these techniques is hampered by their complexity: the definition of a systematic framework is the key element for improving design optimization, enhancing the concurrent use of Computer-Aided tools and Model-Based Definition (MBD) practices. The present doctoral research aims to define and develop an integrated methodology for product/process design optimization that better exploits the new capabilities of advanced simulations and tools. By implementing predictive models and multi-disciplinary optimization, a Computer-Aided Integrated framework for tolerance-cost optimization is proposed that integrates the DfT and DtC approaches and applies them directly to the design of automotive components. Several case studies have been considered, with the final application of the integrated framework to a high-performance V12 engine assembly, achieving both the functional targets and cost reduction. From a scientific point of view, the proposed methodology improves the tolerance-cost optimization of industrial components. The integration of theoretical approaches and Computer-Aided tools makes it possible to analyse the influence of tolerances on both product performance and manufacturing costs. The case studies proved the suitability of the methodology for application in the industrial field and identified further areas for improvement and refinement.
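
To make the tolerance-cost trade-off concrete, a textbook formulation minimizes a reciprocal-power tolerance-cost model subject to a root-sum-square stack-up constraint. The sketch below (a minimal illustration of that classical formulation, not the thesis's framework; all coefficients and limits are hypothetical) uses SciPy:

```python
import numpy as np
from scipy.optimize import minimize

# Reciprocal-power cost model C_i(t_i) = a_i / t_i**b_i (hypothetical coefficients).
a = np.array([4.0, 2.5, 3.0])
b = np.array([1.0, 1.2, 0.8])
T_ASSEMBLY = 0.12   # allowed assembly tolerance (mm), hypothetical

def cost(t):
    return np.sum(a / t**b)

# Root-sum-square stack-up: sqrt(sum t_i^2) must not exceed T_ASSEMBLY.
stackup = {"type": "ineq", "fun": lambda t: T_ASSEMBLY - np.sqrt(np.sum(t**2))}

res = minimize(cost, x0=np.full(3, 0.05), constraints=[stackup],
               bounds=[(1e-3, 0.1)] * 3)
print("optimal tolerances (mm):", res.x)
print("minimum manufacturing cost:", res.fun)
```

Looser tolerances reduce cost but consume the stack-up budget; the optimizer finds the allocation that balances the two, which is the core mechanism any DfT/DtC framework automates.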

Relevance: 100.00%

Abstract:

In the field of power electronics, several types of motor control systems have been developed using STM microcontrollers and power boards. Power electronic inverters are widely used both in industrial power applications and in domestic appliances. Inverters are used to control the torque, speed, and position of the rotor in AC motor drives, and an inverter delivers constant-voltage, constant-frequency power in uninterruptible power supplies. Because conventional inverter power supplies have high power consumption and a low transfer efficiency, a three-phase sine-wave AC power supply was created using the embedded STM32 system, which offers low power consumption and efficient operation. It provides an output frequency of 50 Hz at the rated RMS line voltage. The STM32-based embedded inverter is a power supply that integrates, reduces, and optimizes the power electronics application, covering the required hardware, software, and application solution, including the power architecture, techniques, and tools. Power inverters are currently implemented in green-energy power systems alongside low-power devices such as sensors or microcontrollers to operate motors and pumps. The STM-based power inverter is efficient, low-cost, and reliable. My thesis work was based on STM motor drives and control systems, which can be implemented in a gas analyser to operate its pumps and motors; such systems have been widely applied in various engineering sectors due to their ability to respond to adverse structural changes and their improved reliability. The present research used the STM inverter board with a low-power MCU such as the NUCLEO, starting with practical examples such as a blinking LED and PWM. We then implemented a three-phase inverter model with the STEVAL-IPM08B board, which converts a single-phase 230 V AC input to a three-phase 380 V AC output suitable for operating an induction motor.
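
As a hedged sketch of the reference-generation step behind three-phase sinusoidal PWM (the kind of duty-cycle table a microcontroller timer's compare registers would consume; the carrier frequency, modulation index, and function names below are hypothetical, not taken from the thesis):

```python
import numpy as np

F_OUT = 50.0        # desired output frequency (Hz)
F_PWM = 16000.0     # PWM carrier frequency (Hz), hypothetical
M = 0.9             # modulation index, hypothetical

def duty_cycles(n):
    """Per-phase duty cycles for PWM period n: three sinusoidal references
    displaced by 120 degrees, offset into the 0..1 range expected by a
    timer's compare registers."""
    t = n / F_PWM
    phases = 2*np.pi*F_OUT*t + np.array([0.0, -2*np.pi/3, 2*np.pi/3])
    return 0.5 * (1.0 + M * np.sin(phases))

# One fundamental period of duty-cycle updates (320 PWM periods at 16 kHz).
table = np.array([duty_cycles(n) for n in range(int(F_PWM / F_OUT))])
print(table[:3])    # first few [U, V, W] duty triplets
```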

Relevance: 100.00%

Abstract:

This postdoctoral study on the application of the RIME intervention in women who had undergone mastectomy and were in treatment aimed to promote psychospiritual and social transformations to improve quality of life, self-esteem, and hope. A total of 28 women participated and were randomized into two groups. Brief Psychotherapy (BP) (average of six sessions) was administered in the Control Group, while RIME (three sessions) plus BP (average of five sessions) was applied in the RIME Group. The quantitative results indicated a significant improvement (38.3%) in the perception of quality of life after RIME according to the WHOQOL, compared both to BP in the Control Group (12.5%) and to BP in the RIME Group (16.2%). There was a significant improvement in self-esteem (Rosenberg) after RIME (14.6%) compared to BP in the Control Group (which worsened by 35.9%) and BP in the RIME Group (8.3%). The improvement in well-being with respect to the focus worked on (Visual Analog Scale) was significant in the RIME Group (from bad to good), as well as in the Control Group (from unpleasant to good). The qualitative results indicated that RIME promotes creative transformations in the intrapsychic and interpersonal dimensions, such that new meanings and/or new attitudes emerge into consciousness. It was observed that RIME promotes greater psychic structuring and ego strengthening, and provides a faster transformation than BP; it can therefore be indicated for crisis treatment in the hospital environment.

Relevance: 100.00%

Abstract:

This work presents a geochemical study of the Pitinga cryolite mineralization through REE and Y analyses of the disseminated and massive cryolite ore deposits, as well as of fluorite occurrences. The REE signatures in fluorite and cryolite are similar to those in the Madeira albite granite. The highest ΣREE values are found in magmatic cryolite (677 to 1345 ppm); ΣREE is lower in massive cryolite, with average values for the different cryolite types of 10.3 ppm, 6.66 ppm, and 8.38 ppm (for the nucleated, caramel, and white types, respectively). Disseminated fluorite displays higher ΣREE values (1708 and 1526 ppm) than fluorite in late veins (34.81 ppm). The yttrium concentration is higher in disseminated fluorite and in magmatic cryolite. The evolution of several parameters (ΣREE, LREE/HREE, Y) was followed throughout the successive stages of evolution of the albite granites and the associated mineralization; at the end of the process, late cryolite was formed with a low ΣREE content. The REE data indicate that the MCD was formed by hydrothermal fluids residual from the albite granite, and that these fluids also enriched the disseminated ore through the additional formation of hydrothermal disseminated cryolite. The presence of tetrads is poorly defined, although the nucleated, caramel, and white cryolite types show evidence of the tetrad effect.

Relevance: 100.00%

Abstract:

Shot peening is a cold-working mechanical process in which a shot stream is propelled against a component surface. Its purpose is to introduce compressive residual stresses on component surfaces to increase fatigue resistance. The process is widely applied to springs because of their cyclic load requirements. This paper presents a numerical model of the shot peening process using the finite element method. The results are compared with experimental measurements of the residual stresses, obtained by the X-ray diffraction technique, in leaf springs submitted to this process. Furthermore, the results are compared with empirical and numerical correlations developed by other authors.

Relevance: 100.00%

Abstract:

This work deals with an improved plane frame formulation whose exact dynamic stiffness matrix (DSM) has a null determinant only at the natural frequencies. Compared with the classical DSM, the formulation presented here has some major advantages: local mode shapes are preserved in the formulation, so that the DSM is never ill-conditioned for any positive frequency; and, in the absence of poles, it is possible to employ the secant method to obtain a more computationally efficient eigenvalue extraction procedure. Applying the procedure to the more general case of Timoshenko beams, we introduce a new technique, named "power deflation", that makes the secant method suitable for the transcendental nonlinear eigenvalue problems based on the improved DSM. In order to avoid overflow occurrences that could hinder the secant method iterations, limiting frequencies are formulated, and scaling is also applied to the eigenvalue problem. Comparisons with results available in the literature demonstrate the strength of the proposed method. Computational efficiency is compared with solutions obtained both by FEM and by the Wittrick-Williams algorithm.
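
A minimal sketch of the general idea of secant iteration with root deflation (this illustrates plain deflation on a toy transcendental function, not the paper's specific "power deflation" scheme or its DSM determinant):

```python
import numpy as np

def secant(f, x0, x1, tol=1e-10, max_iter=100):
    """Standard secant iteration for a scalar root of f."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x1 - x0) < tol:
            return x1
    return x1

# Toy transcendental frequency function; its roots play the role of the
# natural frequencies at which the DSM determinant vanishes.
f = lambda w: np.sin(w) * np.cosh(w / 4.0)

roots = []
def deflated(w):
    """Divide out the roots already found, so the secant method is driven
    towards a new root instead of re-converging to an old one."""
    val = f(w)
    for r in roots:
        val /= (w - r)
    return val

for guess in (3.0, 6.0, 9.5):
    roots.append(secant(deflated, guess, guess + 0.1))
print(roots)   # approximately pi, 2*pi, 3*pi
```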

Relevance: 100.00%

Abstract:

Background: The present work aims at applying decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed in the framework of decision theory is whether to accept or reject a film lot from a radiology service. The probability of each decision, for a determined set of variables, was obtained from the selected films. Methods: Based on the routine of a radiology service, a decision probability function was determined for each considered group of combined characteristics related to film quality control. These characteristics were framed as a set of 8 possibilities, resulting in 256 possible decision rules. In order to determine a general utility function for assessing the decision risk, we used a single parameter, called r. The payoffs chosen were: diagnostic result (correct/incorrect), cost (high/low), and patient satisfaction (yes/no), resulting in eight possible combinations. Results: Depending on the value of r, the decision-making carries more or less risk. The utility function was evaluated in order to determine the probability of a decision; the decision was made using the opinions of patients or administrators from a radiology service center. Conclusion: The model is a formal quantitative approach to decisions about medical imaging quality, providing an instrument to discriminate what is really necessary in order to accept or reject a film or a film lot. The method presented herein can help to assess the risk level of an incorrect radiological diagnosis decision.
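
To show why three binary payoff attributes give 8 combinations and 2^8 = 256 decision rules, here is a hedged enumeration sketch. The payoff values, outcome probabilities, rejection payoff, and exponential utility form below are all hypothetical stand-ins; the paper's actual utility function in r is not reproduced.

```python
import itertools
import numpy as np

# The eight combinations of the three binary attributes:
# (diagnostic correct?, low cost?, patient satisfied?).
outcomes = list(itertools.product([1, 0], repeat=3))

def payoff(correct, low_cost, satisfied):
    """Hypothetical payoff for accepting a film that leads to this outcome."""
    return 2.0 * correct + 0.5 * low_cost + 0.5 * satisfied - 1.5

def utility(x, r):
    """Exponential utility with a single risk parameter r (r=0 is risk-neutral)."""
    return x if r == 0 else (1.0 - np.exp(-r * x)) / r

REJECT_PAYOFF = 0.0   # hypothetical: rejecting means retaking the film

# Hypothetical outcome probabilities, as estimated from the film sample.
rng = np.random.default_rng(4)
p = rng.dirichlet(np.ones(8))

def expected_utility(rule, r):
    """A rule maps each of the 8 outcome classes to accept (1) or reject (0)."""
    return sum(p[i] * utility(payoff(*outcomes[i]) if rule[i] else REJECT_PAYOFF, r)
               for i in range(8))

# Rank all 2**8 = 256 rules by expected utility for a given risk attitude r.
best = max(itertools.product([0, 1], repeat=8),
           key=lambda rule: expected_utility(rule, r=0.5))
print("best rule:", best)
```

Varying r changes which rules are preferred, which is the sense in which the single parameter controls the risk taken in the accept/reject decision.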

Relevance: 100.00%

Abstract:

We propose a physically transparent analytic model of astrophysical S factors as a function of the center-of-mass energy E of colliding nuclei (below and above the Coulomb barrier) for nonresonant fusion reactions. For any given reaction, the S(E) model contains four parameters, two of which approximate the barrier potential U(r). They are easily interpolated across many reactions involving isotopes of the same elements, giving accurate practical expressions for S(E) with only a few input parameters for many reactions. The model reproduces the suppression of S(E) at low energies (of astrophysical importance) due to the shape of the low-r wing of U(r), and it can be used to reconstruct U(r) from computed or measured S(E). For illustration, we parametrize our recent calculations of S(E) (using the São Paulo potential and the barrier penetration formalism) for 946 reactions involving stable and unstable isotopes of C, O, Ne, and Mg (with nine parameters for all reactions involving the many isotopes of the same elements, e.g., C+O). In addition, we analyze the astrophysically important ¹²C+¹²C reaction, compare theoretical models with experimental data, and discuss the problem of interpolating reliably known S(E) values to low energies (E ≲ 2-3 MeV).
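
For context, the S factor builds on the standard definition that factors the Gamow Coulomb-penetration exponential out of the fusion cross section, S(E) = σ(E) E exp(2πη), with Sommerfeld parameter η. A minimal sketch of that conversion follows (this is the textbook definition, not the paper's four-parameter model; the cross-section value is a placeholder):

```python
import numpy as np

def sommerfeld_2pi_eta(z1, z2, mu_amu, e_kev):
    """2*pi*eta for the Gamow factor, in the standard form with E in keV and
    the reduced mass in atomic mass units: 2*pi*eta = 31.29*Z1*Z2*sqrt(mu/E)."""
    return 31.29 * z1 * z2 * np.sqrt(mu_amu / e_kev)

def s_factor(sigma_barn, z1, z2, mu_amu, e_kev):
    """Astrophysical S factor from a cross section, in keV*barn:
    S(E) = sigma(E) * E * exp(2*pi*eta)."""
    return sigma_barn * e_kev * np.exp(sommerfeld_2pi_eta(z1, z2, mu_amu, e_kev))

# 12C + 12C: Z1 = Z2 = 6, reduced mass 6 amu; the cross section is a placeholder.
print(s_factor(sigma_barn=1e-9, z1=6, z2=6, mu_amu=6.0, e_kev=2500.0))
```

Because the exponential removes the steep barrier-penetration energy dependence, S(E) is the slowly varying quantity that an analytic model of this kind can parametrize and extrapolate to low energies.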