904 results for National Science Foundation (U.S.). Research Applied to National Needs Program.
Abstract:
Background: Nanotechnologies are developing very rapidly and nanomaterials (NMs) are increasingly being used in a wide range of applications in science, industry and biomedicine.
Abstract:
Object detection is one of the most significant research topics in computer vision. Most reported object detection methods localise the detected object within a bounding box but do not explicitly label its edge contours. Since object contours provide a fundamental diagnostic of object shape, some researchers have begun work on linear contour feature representations for object detection and localisation. However, linear contour feature-based localisation depends heavily on the performance of linear contour detection in natural images, which can be perturbed significantly by a cluttered background. In addition, the conventional approach to achieving rotation-invariant features is to rotate the feature receptive field to align with the local dominant orientation before computing the feature representation. Grid resampling after rotation adds extra computational cost and increases the total time required to compute the feature descriptor. Although this cost is modest on current computers, it becomes significant when the number of local features grows and the application must run in real time on resource-limited "smart devices" such as mobile phones. Motivated by these issues, this thesis proposes a 2D object localisation system that matches features of edge contour points, an alternative method that exploits shape information for object localisation. The approach is inspired by the observation that edge contour points are the basic components of shape contours, and edge point detection is usually simpler to achieve than linear edge contour detection. The proposed localisation system therefore avoids the need for linear contour detection and reduces disruption from the image background. Moreover, since natural images usually contain many more edge contour points than interest points (i.e. corner points), new methods are also proposed to generate rotation-invariant local feature descriptors without pre-rotating the feature receptive field, improving the computational efficiency of the whole system. In detail, the 2D object localisation system matches edge contour point features within a constrained search area based on the initial pose estimate produced by a prior object detection process. The local feature descriptor obtains rotation invariance by exploiting the rotational symmetry of the hexagonal structure, and a set of local feature descriptors is proposed based on a hierarchical hexagonal grouping structure. Ultimately, the 2D object localisation system achieves very promising performance when matching the proposed edge contour point features, with a mean correct labelling rate of 0.8654 and a mean false labelling rate of 0.0314 on data from the Amsterdam Library of Object Images (ALOI). Furthermore, the proposed descriptors are compared against state-of-the-art descriptors and achieve competitive performance in pose estimation, with pose errors of around half a pixel.
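As an illustration of the rotation-invariance idea described in this abstract, the Python sketch below pools gradient magnitudes over concentric rings around an edge point: because each ring is aggregated as a whole, the descriptor is unchanged when the patch rotates about that point, so no receptive-field resampling is needed. The ring radii, pooling function and synthetic data are assumptions for illustration only; the thesis builds its descriptors from a hierarchical hexagonal grouping structure rather than the circular bins used here.

```python
import numpy as np

def ring_pooled_descriptor(grad_mag, center, radii=(2, 4, 6, 8, 10)):
    """Pool gradient magnitude over concentric rings around an edge point.

    Each ring is averaged as a whole, so the resulting vector does not change
    when the patch is rotated about the centre; no receptive-field resampling
    is required. (Illustrative sketch, not the thesis's hexagonal construction.)
    """
    h, w = grad_mag.shape
    cy, cx = center
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
    desc, prev = [], 0.0
    for r in radii:
        mask = (dist > prev) & (dist <= r)        # pixels falling in this ring
        desc.append(grad_mag[mask].mean() if mask.any() else 0.0)
        prev = r
    desc = np.asarray(desc)
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc

# Hypothetical usage on a synthetic gradient-magnitude patch
patch = np.abs(np.random.default_rng(0).normal(size=(81, 81)))
d = ring_pooled_descriptor(patch, center=(40, 40))
print("descriptor:", np.round(d, 3))
```

Two edge points would then be matched, for example, by the Euclidean distance between their descriptors within the constrained search area mentioned in the abstract.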
Abstract:
Rigid adherence to pre-specified thresholds and static graphical representations can lead to incorrect decisions on merging of clusters. As an alternative to existing automated or semi-automated methods, we developed a visual analytics approach for performing hierarchical clustering analysis of short time-series gene expression data. Dynamic sliders control parameters such as the similarity threshold at which clusters are merged and the level of relative intra-cluster distinctiveness, which can be used to identify "weak-edges" within clusters. An expert user can drill down to further explore the dendrogram and detect nested clusters and outliers. This is done by using the sliders and by pointing and clicking on the representation to cut the branches of the tree at multiple heights. A prototype of this tool has been developed in collaboration with a small group of biologists for analysing their own datasets. Initial feedback on the tool has been positive.
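A minimal sketch of the clustering step that such a tool sits on top of, using SciPy: the dendrogram is built once and then re-cut at different heights, which is effectively what the similarity-threshold slider does. The distance metric, linkage method, thresholds and simulated data are assumptions for illustration, not the tool's actual settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Toy short time-series expression matrix: genes x time points (simulated data)
rng = np.random.default_rng(0)
expr = rng.normal(size=(50, 6))

# Correlation distance is a common choice for expression profiles (an assumption here)
dist = pdist(expr, metric="correlation")
tree = linkage(dist, method="average")

# Re-cutting the same dendrogram at different heights stands in for the
# similarity-threshold slider: lower thresholds yield more, tighter clusters.
for threshold in (0.4, 0.7, 1.0):
    labels = fcluster(tree, t=threshold, criterion="distance")
    print(f"threshold={threshold}: {labels.max()} clusters")
```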
Abstract:
The aim of the present study was to propose and evaluate the use of factor analysis (FA) to obtain latent variables (factors) that represent a set of pig traits simultaneously, for use in genome-wide selection (GWS) studies. We used an outbred F2 population produced by crossing Brazilian Piau with commercial pigs. Data were obtained on 345 F2 pigs, genotyped for 237 SNPs, with 41 traits. FA allowed us to obtain four biologically interpretable factors: "weight", "fat", "loin", and "performance". These factors were used as dependent variables in multiple regression models of genomic selection (Bayes A, Bayes B, RR-BLUP, and Bayesian LASSO). The use of FA is presented as an interesting alternative for selecting individuals on multiple variables simultaneously in GWS studies; accuracy measurements of the factors were similar to those obtained when the original traits were considered individually. The similarities between the top 10% of individuals selected by the factor and those selected by the individual traits were also satisfactory. Moreover, the estimated marker effects for the traits were similar to those found for the relevant factor.
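A minimal sketch of the two-step idea with scikit-learn: extract latent factor scores from the trait matrix, then regress each factor on the SNP genotypes. Ridge regression is used here only as a stand-in for RR-BLUP; Bayes A, Bayes B and Bayesian LASSO require dedicated samplers and are not shown. The data shapes mirror the study (345 animals, 41 traits, 237 SNPs) but the values are simulated, not the study's data.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import Ridge

# Simulated stand-ins with the study's dimensions: 345 animals, 41 traits, 237 SNPs
rng = np.random.default_rng(1)
traits = rng.normal(size=(345, 41))
snps = rng.integers(0, 3, size=(345, 237)).astype(float)   # 0/1/2 genotype coding

# Step 1: reduce the 41 traits to a few latent factors
fa = FactorAnalysis(n_components=4, random_state=1)
scores = fa.fit_transform(traits)            # factor scores, one column per factor

# Step 2: regress each factor on the markers (ridge regression as an RR-BLUP stand-in)
for k in range(scores.shape[1]):
    model = Ridge(alpha=1.0).fit(snps, scores[:, k])
    gebv = model.predict(snps)               # fitted genomic values for factor k
    r = np.corrcoef(gebv, scores[:, k])[0, 1]
    print(f"factor {k}: fitted-value correlation = {r:.2f}")
```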
Abstract:
Three-dimensional direct numerical simulations (DNS) have been performed on a finite-size hemisphere-cylinder model at angle of attack AoA = 20° and Reynolds numbers Re = 350 and 1000. Under these conditions, massive separation exists on the nose and lee side of the cylinder, and at both Reynolds numbers the flow is found to be unsteady. Proper orthogonal decomposition (POD) and dynamic mode decomposition (DMD) are employed to study the primary instability that triggers unsteadiness at Re = 350. The dominant coherent flow structures identified at the lower Reynolds number are also found to exist at Re = 1000; the question is then posed whether the flow oscillations and structures found at the two Reynolds numbers are related. POD and DMD computations are performed using different subdomains of the DNS computational domain. Besides reducing the computational cost of the analyses, this also makes it possible to isolate spatially localized oscillatory structures from other, more energetic structures present in the flow. It is found that POD and DMD are in general sensitive to domain truncation, and that uninformed choices of the subdomain may lead to inconsistent results. Analyses at Re = 350 show that the primary instability is related to the counter-rotating vortex pair forming the three-dimensional afterbody wake and is characterized by the frequency St ≈ 0.11, in line with results in the literature. At Re = 1000, vortex shedding is present in the wake with an associated broadband spectrum centered around the same frequency. The horn/leeward vortices on the cylinder lee side, upstream of the cylinder base, also present finite-amplitude oscillations at the higher Reynolds number. The spatial structure of these oscillations, described by the POD modes, is easily differentiated from that of the wake oscillations. Additionally, the frequency spectra associated with the lee-side vortices present well-defined peaks, corresponding to St ≈ 0.11 and its first few harmonics, as opposed to the broadband spectrum found in the wake.
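A minimal sketch of snapshot POD as used in analyses of this kind: stack flow snapshots as columns, subtract the temporal mean, and take the SVD. Restricting the snapshots to a spatial subdomain (a boolean mask here) mirrors the paper's use of truncated domains to isolate localized oscillatory structures. Variable names, shapes and the random data are illustrative assumptions, not the paper's DNS fields.

```python
import numpy as np

def pod_modes(snapshots, subdomain_mask=None, n_modes=5):
    """Snapshot POD via the SVD.

    snapshots      : array (n_points, n_snapshots) of a flow variable sampled in time.
    subdomain_mask : optional boolean array over the spatial points; restricting the
                     decomposition to a subdomain isolates structures living inside it.
    """
    X = snapshots[subdomain_mask] if subdomain_mask is not None else snapshots
    X = X - X.mean(axis=1, keepdims=True)            # subtract the temporal mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    energy = s**2 / np.sum(s**2)                     # relative energy of each mode
    return U[:, :n_modes], energy[:n_modes], Vt[:n_modes]   # spatial modes, energies, temporal coefficients

# Hypothetical usage with random data standing in for DNS snapshots
snaps = np.random.default_rng(2).normal(size=(2000, 120))
modes, energy, coeffs = pod_modes(snaps, n_modes=3)
print("leading mode energies:", np.round(energy, 3))
```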
Abstract:
Attention Deficit Hyperactivity Disorder is a condition that affects 3 to 5 percent of children globally. Many of these children live in areas with very few or no medical professionals qualified to help them. To help address this problem, a system was developed that allows physicians to follow their patients' progress and prescribe treatments. These treatments can be drugs or behavioral exercises. The behavioral exercises were designed in the form of games in order to motivate the patients, who are children, to engage in the treatment. The system allows the patients to play the prescribed games under the supervision of their tutors. Each game is designed to improve the patient's management of the condition by training a specific mental component. The objective of this approach is to complement the traditional form of treatment by allowing a physician to prescribe therapeutic games and keep the patients under supervision between their regular consultations. The main goal of this project is to provide the patients with better control of their symptoms than with traditional therapy alone. Experimental field tests with children and clinical staff offer promising results. This research is developed in the context of a funded project involving INESC C (Polytechnic Institute of Leiria delegation), the Santo André Hospital of Leiria, and the start-up company PlusrootOne (which owns the project).
Abstract:
Introduction: Schools provide the opportunity to reach a large number of adolescents in a systematic way; however, increasing demands on the curriculum present challenges for health promotion activities. This paper describes the research processes and strategies used to design an injury prevention program.
Methods: A multi-stage process of data collection included: (1) surveys on injury-risk behaviours to identify targets of change (examining behaviour and risk/protective factors among more than 4000 adolescents); (2) focus groups (n = 30 high-risk adolescents) to understand and determine risk situations; (3) a hospital emergency outpatients survey to understand injury types and situations; (4) a workshop (n = 50 teachers/administrators) to understand the target curriculum and experiences with injury-risk behaviours; (5) additional focus groups (students and teachers) regarding draft material and processes.
Results: Summaries of findings from each stage are presented, demonstrating the design process in particular. The baseline data identified target risk and protective factors. The subsequent qualitative study provided detail about content and context and, together with the hospital findings, assisted in developing ways to ensure relevance and meaning (e.g. identifying high-risk situations and providing insights into language, culture and development). School staff identified links to school processes, with the final data providing feedback on curriculum fit, feasibility and appropriateness of resources. The data were integrated into a program which demonstrated reduced injury.
Conclusions: A comprehensive research process is required to develop an informed and effective intervention. The next stage, a cluster randomised controlled trial, is a major task and justifies the intensive and comprehensive development.
A research framework to investigate the performance of financial incentives in construction projects
Abstract:
This paper looks at the challenges presented to the Australian Library and Information Association by its role as the professional association responsible for ensuring the quality of Australian library technician graduates. There is a particular focus on the issue of course recognition, where the Association's role is complicated by the need to work alongside the national quality assurance processes established by the relevant technical education authorities. The paper describes the history of course recognition in Australia; examines the relationship between course recognition and other quality measures; and describes the process the Association has recently undertaken to ensure appropriate professional scrutiny in a changing environment of accountability.
Abstract:
This paper details a systematic literature review identifying problems in extant research relating to teachers’ attitudes towards reporting child sexual abuse, and offers a model for new attitude scale development and testing. Scale development comprised a five-phase process grounded in contemporary attitude theories including: a) developing the initial item pool; b) conducting a panel review; c) refining the scale via an expert focus group; d) building content validity through cognitive interviews; e) assessing internal consistency via field testing. The resulting 21-item scale displayed construct validity in preliminary testing. The scale may prove useful as a research tool, given the theoretical supposition that attitudes may be changed with time, context, experience, and education. Further investigation with a larger sample is warranted.
Abstract:
The Ashgate Research Companion to Queer Theory is a solid collection providing an overview of the past, present and future applications of queer theory. The Companion confidently lives up to its name as a research companion, offering theories and methodologies that readers can draw on to apply queer theory in their own work.