173 results for zoo map task
Abstract:
Young people in long-term foster care are at risk of experiencing poor social, emotional, behavioural and educational outcomes. Moreover, these placements have a significantly greater chance of breaking down than those involving younger children. This article critically evaluates the factors associated with this particular outcome, drawing on a literature review conducted by a social work practitioner in one Health and Social Care Trust in Northern Ireland. The findings evidenced that, apart from overriding safety concerns, placement breakdown was not a one-off event but rather a complex process involving the interplay between a range of dynamic risk and protective factors over time, operating in the wider context of the young person’s history and life experiences. The significance of these findings for social work practitioners is then considered by identifying key theories to inform understanding and intervention.
Abstract:
The X-linked lymphoproliferative syndrome (XLP) is an inherited immunodeficiency to Epstein-Barr virus infection that has been mapped to chromosome Xq25. Molecular analysis of XLP patients from ten different families identified a small interstitial constitutional deletion in one patient (XLP-D). This deletion, initially defined by a single marker, DF83, known to map to the interval Xq24-q26.1, is nested within a previously reported and much larger deletion in another XLP patient (XLP-739). A cosmid minilibrary was constructed from a single mega-YAC and used to establish a contig encompassing the whole XLP-D deletion and a portion of the XLP-739 deletion. Based on this contig, the size of the XLP-D deletion can be estimated at 130 kb. The identification of this minimal deletion, within which at least a portion of the XLP gene is likely to reside, should greatly facilitate efforts to isolate the gene.
Abstract:
This paper describes an investigation of various shroud bleed slot configurations of a centrifugal compressor using CFD with a manual multi-block structured grid generation method. The compressor under investigation is used in a turbocharger application for a heavy-duty diesel engine of approximately 400 hp. The baseline numerical model has been developed and validated against experimental performance measurements. The influence of the bleed slot flow field across a range of operating conditions between surge and choke has been analysed in detail. The impact of the returning bleed flow on the incidence at the impeller blade leading edge, due to its mixing with the main through-flow, has also been studied. From the baseline geometry, a number of modifications to the bleed slot width have been proposed, and a detailed comparison of the flow characteristics has been performed. The impact of slot variations on the inlet incidence angle has been investigated, highlighting the improvement in surge and choked flow capability. In addition, the role of the bleed slot in stabilizing the blade passage flow near surge, through suction of the tip and over-tip vortex flow into the slot, has been considered.
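As a back-of-envelope illustration of the incidence mechanism described in this abstract (not a result from the paper), the sketch below evaluates a simple inducer velocity triangle: adding the returning bleed flow to the main through-flow raises the meridional velocity near the shroud, which reduces the leading-edge incidence at low flow rates. All speeds, radii and angles are assumed placeholder values.

```cpp
// Illustrative velocity-triangle estimate (all numbers are assumptions,
// not values from the study) of how extra meridional flow from the
// returning bleed reduces incidence at the impeller leading edge.
#include <cmath>
#include <cstdio>

int main() {
    const double pi         = 3.141592653589793;
    const double rpm        = 90000.0;  // assumed shaft speed
    const double r_shroud   = 0.030;    // [m] assumed inducer shroud radius
    const double blade_beta = 62.0;     // [deg] assumed blade angle from axial

    const double U = 2.0 * pi * (rpm / 60.0) * r_shroud;  // blade speed [m/s]

    // Meridional velocity near the shroud: main through-flow alone vs. with
    // the recirculated bleed flow added (both values assumed).
    for (double cm : {90.0, 110.0}) {
        double beta_flow = std::atan(U / cm) * 180.0 / pi;  // relative flow angle from axial
        double incidence = beta_flow - blade_beta;          // positive: flow more tangential than blade
        std::printf("Cm = %5.1f m/s -> relative flow angle %5.1f deg, incidence %+5.1f deg\n",
                    cm, beta_flow, incidence);
    }
    return 0;
}
```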
Abstract:
As data analytics grows in importance, it is quickly becoming one of the dominant application domains that require parallel processing. This paper investigates the applicability of OpenMP, the dominant shared-memory parallel programming model in high-performance computing, to the domain of data analytics. We contrast the performance and programmability of key data analytics benchmarks against Phoenix++, a state-of-the-art shared-memory map/reduce programming system. Our study shows that OpenMP outperforms the Phoenix++ system by a large margin for several benchmarks. In other cases, however, the programming model lacks support for this application domain.
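To make the programming-model comparison concrete, here is a minimal sketch (not one of the paper's benchmarks) of a histogram-style map/reduce kernel written with OpenMP; the synthetic input, bin count and use of an OpenMP 4.5 array reduction are illustrative assumptions.

```cpp
// Minimal histogram-style map/reduce kernel in OpenMP — an illustration of
// the programming-model comparison, not code from the paper.
// Build (e.g.): g++ -O2 -fopenmp histogram.cpp
#include <cstdio>
#include <vector>

int main() {
    constexpr int kBins = 16;
    std::vector<int> data(1 << 20);
    for (std::size_t i = 0; i < data.size(); ++i)
        data[i] = static_cast<int>(i % kBins);   // synthetic records

    long hist[kBins] = {0};

    // "Map" over the records and "reduce" into per-bin counts; the OpenMP 4.5+
    // array reduction plays the role of Phoenix++'s combiner containers.
    #pragma omp parallel for reduction(+ : hist[:kBins])
    for (long i = 0; i < static_cast<long>(data.size()); ++i)
        ++hist[data[i]];

    for (int b = 0; b < kBins; ++b)
        std::printf("bin %2d: %ld\n", b, hist[b]);
    return 0;
}
```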
Abstract:
Single component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant sum constraint (closure) and to the inherently multivariate, relative information conveyed by compositional data. Well known, for instance, is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g., the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to removal of potassium during weathering. The validity of classical single component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles); this is similar to the classical normalisation approach to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis, clustering of variables, etc.) to extract potentially interesting geochemical summaries. The caution from this work is that if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant sum constraint. In summary, the authors recommend a chain of enquiry that involves searching for the appropriate statistical method that can answer the required geological or geochemical question whilst maintaining the integrity of the compositional nature of the data. The required log-ratio transformations should be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
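As a concrete, entirely illustrative example of the knowledge-driven log-ratio idea, the sketch below expresses Zn against Al2O3 (one common conservative reference for the non-quartz fraction) so that dilution by inert quartz cancels in the ratio; the element choice and all concentrations are assumptions, not data from the paper.

```cpp
// Illustrative knowledge-driven log-ratio (values and element choice are
// assumptions): ratioing Zn against Al2O3 cancels dilution by inert quartz,
// unlike the raw single-component Zn value.
#include <cmath>
#include <cstdio>

int main() {
    // Hypothetical soil samples: {Zn [mg/kg], Al2O3 [wt%]}.
    const double samples[][2] = {
        {  80.0, 12.0 },  // reference soil
        {  40.0,  6.0 },  // same soil diluted ~50% by quartz: raw Zn halves
        { 160.0, 12.0 },  // genuinely Zn-enriched soil
    };

    for (const auto& s : samples) {
        double zn    = s[0];
        double al2o3 = s[1] * 1.0e4;           // wt% -> mg/kg, common scale
        double lr    = std::log(zn / al2o3);   // additive-style log-ratio
        std::printf("Zn = %6.1f mg/kg, Al2O3 = %4.1f wt%%  ->  ln(Zn/Al2O3) = %6.3f\n",
                    s[0], s[1], lr);
    }
    // The diluted and undiluted samples give the same log-ratio (about -7.31),
    // while the genuinely enriched sample stands out (about -6.62).
    return 0;
}
```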
Abstract:
The environmental quality of land is often assessed by the calculation of threshold values which aim to differentiate between concentrations of elements according to whether the soils are in residential or industrial sites. In Europe, for example, soil guideline values exist for agricultural and grazing land. A threshold is often set to differentiate between concentrations of an element that occur naturally in the soil and concentrations that result from diffuse anthropogenic sources. Regional geochemistry and, in particular, single component geochemical maps are increasingly being used to determine these baseline environmental assessments. The key question raised in this paper is whether the geochemical map can provide an accurate interpretation on its own. Implicit is the assumption that single component geochemical maps represent absolute abundances. However, because of the compositional (closed) nature of the data, univariate geochemical maps cannot be compared directly with one another. As a result, any interpretation based on them is vulnerable to spurious correlation problems. What does this mean for soil geochemistry mapping, baseline quality documentation, soil resource assessment or risk evaluation? Despite the limitation of relative abundances, individual raw geochemical maps are deemed fundamental to several of these applications, including environmental assessments. However, an element's toxicity is related to its bioavailable concentration, which is lowered if its source is mixed with another source. Elements also interact: under reducing conditions, for example, iron oxides lose their solid state and the arsenic associated with them becomes soluble and mobile. Both of these matters may be dealt with more adequately if a single component map is not interpreted in isolation when determining baseline and threshold assessments. A range of alternative, compositionally compliant representations based on log-ratio and log-contrast approaches is explored to supplement the classical single component maps for environmental assessment. Case study examples are shown based on the Tellus soil geochemical dataset covering Northern Ireland, together with the results of in vitro oral bioaccessibility testing carried out on a sub-set of archived Tellus Survey shallow soils following the Unified BARGE Method (Bioaccessibility Research Group of Europe).
Abstract:
We study the computational complexity of finding maximum a posteriori configurations in Bayesian networks whose probabilities are specified by logical formulas. This approach leads to a fine-grained study in which local information such as context-sensitive independence and determinism can be considered. It also allows us to characterize more precisely the jump from tractability to NP-hardness and beyond, and to consider the complexity introduced by evidence alone.
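For readers unfamiliar with the MAP task, the toy sketch below (not from the paper) brute-forces the most probable configuration of two Boolean variables in a three-node network given evidence on the third; the exponential enumeration it relies on is exactly what the complexity analysis targets. All CPT values are made up.

```cpp
// Toy illustration of the MAP problem: brute-force enumeration over a
// 3-variable Boolean network A -> B, A -> C with evidence C = true.
// All CPT numbers are invented for illustration.
#include <cstdio>

int main() {
    // CPTs: P(A=1), P(B=1|A), P(C=1|A) — illustrative values only.
    const double pA1      = 0.3;
    const double pB1_A[2] = {0.2, 0.9};   // indexed by A
    const double pC1_A[2] = {0.1, 0.7};   // indexed by A

    const int evidenceC = 1;              // observed C = true

    double best = -1.0;
    int bestA = 0, bestB = 0;

    // MAP over {A, B} given C: exhaustive enumeration, exponential in the
    // number of unobserved variables — the motivation for studying when
    // structure (determinism, context-specific independence) helps.
    for (int a = 0; a <= 1; ++a) {
        for (int b = 0; b <= 1; ++b) {
            double pa = a ? pA1 : 1.0 - pA1;
            double pb = b ? pB1_A[a] : 1.0 - pB1_A[a];
            double pc = evidenceC ? pC1_A[a] : 1.0 - pC1_A[a];
            double joint = pa * pb * pc;
            std::printf("A=%d B=%d : P = %.4f\n", a, b, joint);
            if (joint > best) { best = joint; bestA = a; bestB = b; }
        }
    }
    std::printf("MAP configuration: A=%d, B=%d (joint with evidence %.4f)\n",
                bestA, bestB, best);
    return 0;
}
```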
Abstract:
Predicting the next location of a user based on their previous visiting pattern is one of the primary tasks over data from location based social networks (LBSNs) such as Foursquare. Many different aspects of a user's so-called “check-in” profile have been used for this task, including spatial and temporal information of check-ins as well as the social network information of the user. Building more sophisticated prediction models by enriching check-in data with information from other sources is challenging because of the limited data that these LBSNs expose owing to privacy concerns. In this paper, we propose a framework that uses the location data from LBSNs and combines it with data from maps to associate a set of venue categories with these locations. For example, if the user is found to be checking in at a mall that, according to the map, has cafes, cinemas and restaurants, all of this category information is associated with the check-in. The category information is then leveraged to predict the user's next check-in location. Our experiments with a publicly available check-in dataset show that this approach improves on the state-of-the-art methods for location prediction.
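A minimal sketch of the category-association idea (not the authors' actual model): check-ins are enriched with venue categories, as if retrieved from a map source, and candidate next venues are scored by category-to-category transition counts. All venue names, categories and the scoring rule are hypothetical.

```cpp
// Minimal sketch (hypothetical data and scoring, not the authors' model):
// enrich check-ins with map-derived venue categories and score candidate
// next venues by observed category-to-category transition counts.
#include <cstdio>
#include <map>
#include <string>
#include <vector>

int main() {
    // Hypothetical map lookup: venue -> categories found at/around it.
    std::map<std::string, std::vector<std::string>> venueCategories = {
        {"CentralMall", {"cafe", "cinema", "restaurant"}},
        {"RiverPark",   {"park"}},
        {"OldTownCafe", {"cafe"}},
        {"MegaPlex",    {"cinema"}},
    };

    // One user's (made-up) check-in sequence.
    std::vector<std::string> history = {"OldTownCafe", "CentralMall", "MegaPlex",
                                        "OldTownCafe", "CentralMall"};

    // Count category -> category transitions along the history.
    std::map<std::string, std::map<std::string, int>> trans;
    for (std::size_t i = 0; i + 1 < history.size(); ++i)
        for (const auto& from : venueCategories[history[i]])
            for (const auto& to : venueCategories[history[i + 1]])
                ++trans[from][to];

    // Score candidate venues for the next check-in from the last venue's categories.
    const auto& lastCats = venueCategories[history.back()];
    for (const auto& candidate : {"RiverPark", "MegaPlex", "OldTownCafe"}) {
        int score = 0;
        for (const auto& from : lastCats)
            for (const auto& to : venueCategories[candidate])
                score += trans[from][to];   // default-inserts 0 for unseen pairs
        std::printf("candidate %-11s score %d\n", candidate, score);
    }
    return 0;
}
```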
Abstract:
In a number of species, individuals showing lateralized hand/paw usage (i.e. the preferential use of either the right or the left paw) have been shown to be more proactive in novel situations than ambilateral individuals. In the current study we used an established test to assess preferential paw usage in dogs (the Kong test) and then compared the performance of ambilateral and lateralized dogs, as well as left- vs. right-pawed dogs, in a novel manipulative problem-solving task. Results showed an equal proportion of ambilateral and lateralized dogs, but, contrary to predictions, non-lateralized dogs were faster at accessing the apparatus in test trials. No differences emerged between right- and left-pawed dogs. Results are discussed in relation to previous studies on lateralization.
Abstract:
We explored the brain’s ability to quickly prevent a prepotent but unwanted motor response. To address this, transcranial magnetic stimulation was delivered over the motor cortex (hand representation) to probe excitability changes immediately after somatosensory cues prompted subjects either to move as fast as possible or to withhold movement. Our results showed a difference in motor cortical excitability 90 ms post-stimulus, contingent on cues to either promote or prevent movement. We suggest that our study design, emphasizing response speed coupled with well-defined early probes, allowed us to extend similar past investigations into the timing of response inhibition.