935 results for Byrsonima basiloba extract


Relevance:

10.00%

Publisher:

Abstract:

This Australian Indigenous creative work and its Treatise promote ways of thinking about practice and research that extend well beyond the current discourse. It invites re-thinking of how research can be practice-led in new ways, and what that might mean for future students. In discussing the challenges of today, this work highlights how "Western-style" thinking and theory is wanting in so many ways. It engages a new, dynamic and innovative way of theorising, encouraging future students to apply their full capacity of energy and wisdom. (Extract from examiners' reports.)

Relevance:

10.00%

Publisher:

Abstract:

The upstream oil & gas industry has been contending with massive data sets and monolithic files for many years, but "Big Data" (that is, the ability to apply more sophisticated types of analytical tools to information in a way that extracts new insights or creates new forms of value) is a relatively new concept that has the potential to significantly re-shape the industry. Despite the impressive amount of value being realized by Big Data technologies in other parts of the marketplace, much of the data collected within the oil & gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This paper examines existing data management practices in the upstream oil & gas industry, and compares them to practices and philosophies that have emerged in organizations that are leading the Big Data revolution. The comparison shows that, in companies leading that revolution, data is regarded as a valuable asset. The presented evidence also shows, however, that this is usually not true within the oil & gas industry, where data is frequently regarded as descriptive information about a physical asset rather than something that is valuable in and of itself. The paper then discusses how upstream oil & gas companies could potentially extract more value from data, and concludes with a series of specific technical and management-related recommendations to this end.

Relevance:

10.00%

Publisher:

Abstract:

Contemporary cities no longer offer the same types of permanent environments that we planned for in the latter part of the twentieth century. Our public spaces are increasingly temporary, transient, and ephemeral. The theories, principles and tactics with which we designed these spaces in the past are no longer appropriate. We need a new theory for understanding the creation, use, and reuse of temporary public space. More than a theory, we need new architectural tactics or strategies that can be reliably employed to create successful temporary public spaces. This paper presents ongoing research that starts that process through critical review and technical analysis of existing and historic temporary public spaces. Through the analysis of a number of public spaces that were either designed for temporary use or became temporary through changing social conditions, this research identifies the tactics and heuristics used in such projects. These tactics and heuristics are then analysed to extract some broader principles for the design of temporary public space. The theories of time-related building layers, a model of environmental sustainability, and the recycling of social meaning are all explored. The paper goes on to identify a number of key questions that need to be explored and addressed by a theory for such developments: How can we retain social meaning in the fabric of the city and its public spaces while we disassemble it and recycle it into new purposes? What role will preservation have in the rapidly changing future; will exemplary temporary spaces be preserved and thereby become no longer temporary? Does the environmental advantage of recycling materials, components and spaces outweigh the removal or social loss of temporary public space? This research starts to identify the knowledge gaps and proposes a number of strategies for making public space in an age of temporary, recyclable, and repurposed urban infrastructure; a way of creating lighter, cheaper, quicker, and temporary interventions.

Relevance:

10.00%

Publisher:

Abstract:

We report a more accurate method to determine the density of trap states in a polymer field-effect transistor. In the approach we describe in this letter, we take the sub-threshold behavior into consideration in the calculation of the density of trap states. This is important because the sub-threshold regime of operation extends to fairly large gate voltages in these disordered-semiconductor-based transistors. We employ the sub-threshold drift-limited mobility model for the sub-threshold response and the conventional linear mobility model for the above-threshold response. The combined use of these two models allows us to extract the density of states from charge transport data much more accurately. We demonstrate our approach by analyzing data from high-mobility diketopyrrolopyrrole-based copolymer transistors. This approach will also work well for other disordered semiconductors in which sub-threshold conduction is important.
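As a rough illustration of how a trap density can be read off transfer-curve data, the sketch below uses the textbook relation between sub-threshold swing and an effective areal trap density, not the letter's drift-limited mobility model, and all numbers are invented placeholders:

```python
import numpy as np

# Hypothetical sub-threshold transfer-curve data (V_G in V, I_D in A);
# replace with measured values for a real device.
V_G = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
I_D = np.array([1e-11, 3e-11, 1e-10, 3e-10, 1e-9, 3e-9, 1e-8])

kT_q = 0.0259      # thermal voltage at ~300 K (V)
C_i = 1.0e-8       # assumed gate-dielectric capacitance per area (F/cm^2)
q = 1.602e-19      # elementary charge (C)

# Sub-threshold swing S = dV_G / d(log10 I_D), from a linear fit.
slope, _ = np.polyfit(V_G, np.log10(I_D), 1)   # decades per volt
S = 1.0 / slope                                # volts per decade

# Textbook estimate of an effective trap density per area and energy from S,
# ignoring depletion capacitance (units: cm^-2 eV^-1).
N_t = (C_i / q) * (S / (kT_q * np.log(10)) - 1.0)

print(f"S = {S * 1e3:.0f} mV/decade, N_t ~ {N_t:.2e} cm^-2 eV^-1")
```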

Relevance:

10.00%

Publisher:

Abstract:

Accurate process model elicitation continues to be a time-consuming task, requiring skill on the part of the interviewer to extract explicit and tacit process information from the interviewee. Many errors occur in this elicitation stage that could be avoided by better activity recall, more consistent specification methods, and greater engagement by interviewees in the elicitation process. Metasonic GmbH has developed a process elicitation tool for its process suite. As part of a research engagement with Metasonic, staff from QUT, Australia have developed a 3D virtual world approach to the same problem, viz. eliciting process models from stakeholders in an intuitive manner. This book chapter tells the story of how QUT staff developed the 3D virtual world tool for process elicitation, took the outcomes of their research project to Metasonic for evaluation, and, finally, how Metasonic responded to the initial proof of concept.

Relevance:

10.00%

Publisher:

Abstract:

Though popular, concepts such as Toffler's 'prosumer' (1970; 1980; 1990) are inherently limited in their ability to accurately describe the makeup and dynamics of current co-creative environments, from fundamentally non-profit initiatives like Wikipedia to user-industry partnerships that engage in crowdsourcing and the development of collective intelligence. Instead, the success or failure of such projects can be understood best if the traditional producer/consumer divide is dissolved, allowing for the emergence of the produser (Bruns, 2008). A close investigation of leading spaces for produsage makes it possible to extract the key principles which underpin and guide such content co-creation, and to identify how innovative pro-am partnerships between commercial entities and user communities might be structured in order to maximise the benefits that both sides will be able to draw from such collaboration. This chapter outlines these principles, and points to successes and failures in applying them to pro-am initiatives.

Relevance:

10.00%

Publisher:

Abstract:

The lateral amygdala (LA) receives information from auditory and visual sensory modalities, and uses this information to encode lasting memories that predict threat. One unresolved question about the amygdala is how multiple memories, derived from different sensory modalities, are organized at the level of neuronal ensembles. We previously showed that fear conditioning using an auditory conditioned stimulus (CS) was spatially allocated to a stable topography of neurons within the dorsolateral amygdala (LAd) (Bergstrom et al., 2011). Here, we asked how fear conditioning using a visual CS is topographically organized within the amygdala. To induce a lasting fear memory trace, we paired either an auditory (2 kHz, 55 dB, 20 s) or visual (1 Hz, 0.5 s on/0.5 s off, 35 lux, 20 s) CS with a mild foot-shock unconditioned stimulus (0.6 mA, 0.5 s). To detect learning-induced plasticity in amygdala neurons, we used immunohistochemistry with an antibody against phosphorylated mitogen-activated protein kinase (pMAPK). Using a principal components analysis-based approach to extract and visualize spatial patterns, we uncovered two unique spatial patterns of activated neurons in the LA associated with auditory and visual fear conditioning. The first spatial pattern was specific to auditory cued fear conditioning and consisted of activated neurons topographically organized throughout the LAd and ventrolateral nuclei (LAvl) of the LA. The second spatial pattern overlapped for auditory and visual fear conditioning and comprised activated neurons located mainly within the LAvl. Overall, the density of pMAPK-labeled cells throughout the LA was greatest in the auditory CS group, even though freezing in response to the visual and auditory CS was equivalent. There were no differences detected in the number of pMAPK-activated neurons within the basal amygdala nuclei. Together, these results provide the first basic knowledge about the organizational structure of two different fear engrams within the amygdala and suggest that they are dissociable at the level of neuronal ensembles within the LA.
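A minimal sketch of the kind of PCA-based spatial analysis described, assuming each animal contributes a grid of binned pMAPK-positive cell counts; the group sizes, grid shape, and all values below are synthetic placeholders, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical input: one row per animal, columns are pMAPK+ cell counts in a
# fixed 8 x 12 grid of spatial bins across the LA, flattened to 96 values.
rng = np.random.default_rng(0)
auditory = rng.poisson(5.0, size=(10, 96))   # 10 auditory-CS animals
visual = rng.poisson(4.0, size=(10, 96))     # 10 visual-CS animals
X = np.vstack([auditory, visual]).astype(float)

# Extract the dominant spatial components of activation.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)                  # per-animal loading on each pattern
patterns = pca.components_.reshape(2, 8, 12)   # each component as a spatial map

print("variance explained:", pca.explained_variance_ratio_)
print("auditory group mean scores:", scores[:10].mean(axis=0))
print("visual group mean scores:  ", scores[10:].mean(axis=0))
```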

Relevance:

10.00%

Publisher:

Abstract:

Deliberate firesetting costs our community through the destruction of property and lives. Public concern heightens when similar fires occur in a series, raising the specter of copycat firesetting. Difficulties associated with researching copycat crimes in general mean that little is known about copycat firesetting. As an initial step toward filling this research gap, we explore connections between research on copycat crime and research into deliberate firesetting. The intention is to extract salient features from what is known about the phenomena of deliberate firesetting and copycat crime, map them together, and point out shared and unique characteristics. It is argued that the "copycat firesetter" is likely to exist as a distinct subgroup and may require targeted interventions.

Relevance:

10.00%

Publisher:

Abstract:

We propose the use of optical flow information as a method for detecting and describing changes in the environment from the perspective of a mobile camera. We analyze the characteristics of the optical flow signal and demonstrate how robust flow vectors can be generated and used for the detection of depth discontinuities and appearance changes at key locations. To achieve this, a full discussion of camera positioning, distortion compensation, noise filtering, and parameter estimation is presented. We then extract statistical attributes from the flow signal to describe the location of the scene changes, and employ clustering and the dominant shape of the flow vectors to increase descriptiveness. Once a database of nodes (where a node is a detected scene change) and their corresponding flow features is created, matching can be performed whenever nodes are encountered, so that topological localization can be achieved. We retrieve the most likely node according to the Mahalanobis and chi-square distances between the current frame and the database. The results illustrate the applicability of the technique for detecting and describing scene changes under diverse lighting conditions, considering indoor and outdoor environments and different robot platforms.
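A minimal sketch of this kind of pipeline, assuming OpenCV's Farneback dense flow as a stand-in for the paper's flow computation and a much simpler attribute set than the one actually used; node descriptors and their covariance are hypothetical:

```python
import cv2
import numpy as np
from scipy.spatial.distance import mahalanobis

def flow_descriptor(prev_gray, gray):
    """Dense optical flow between consecutive frames, summarised by simple
    statistics of flow magnitude and angle (a stand-in for richer attributes)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    return np.array([mag.mean(), mag.std(), np.median(mag), ang.mean(), ang.std()])

def match_node(desc, node_descs, cov_inv):
    """Return the index and distance of the database node whose descriptor is
    closest to the current one under the Mahalanobis distance."""
    dists = [mahalanobis(desc, d, cov_inv) for d in node_descs]
    return int(np.argmin(dists)), min(dists)

if __name__ == "__main__":
    # Two synthetic frames with a shifted bright square, just to exercise the code.
    prev = np.zeros((240, 320), np.uint8); prev[100:140, 100:140] = 255
    curr = np.zeros((240, 320), np.uint8); curr[100:140, 110:150] = 255
    print(flow_descriptor(prev, curr))
```

The inverse covariance used for the Mahalanobis distance would typically be estimated from the stored node descriptors, e.g. `np.linalg.inv(np.cov(np.vstack(node_descs).T))`.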

Relevance:

10.00%

Publisher:

Abstract:

The upstream oil and gas industry has been contending with massive data sets and monolithic files for many years, but "Big Data" is a relatively new concept that has the potential to significantly re-shape the industry. Despite the impressive amount of value being realized by Big Data technologies in other parts of the marketplace, much of the data collected within the oil and gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This viewpoint examines existing data management practices in the upstream oil and gas industry, and compares them to practices and philosophies that have emerged in organizations leading the way in Big Data. The comparison shows that, in companies widely considered to be leaders in Big Data analytics, data is regarded as a valuable asset, but this is usually not true within the oil and gas industry, where data is frequently regarded as descriptive information about a physical asset rather than something that is valuable in and of itself. The paper then discusses how the industry could potentially extract more value from data, and concludes with a series of policy-related questions to this end.

Relevance:

10.00%

Publisher:

Abstract:

This study focuses on trying to understand why the range of experience with HIV infection is so diverse, especially as regards the latency period. The challenge is to determine what assumptions can be made about the nature of the experience of antigenic invasion and diversity that can be modelled, tested and argued plausibly. To investigate this, an agent-based approach is used to extract high-level behaviour, which cannot be described analytically, from the set of interaction rules at the cellular level. A prototype model encompasses local variation in baseline properties contributing to the individual disease experience, and is included in a network which mimics the chain of lymphatic nodes. Dealing with massively multi-agent systems requires major computational effort; however, parallelisation methods are a natural consequence and advantage of the multi-agent approach. These are implemented using the MPI library.
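A minimal sketch, not the authors' model, of how such a massively multi-agent simulation might be partitioned across MPI ranks with mpi4py; the agent states, infection rule, and all parameters are placeholders:

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_AGENTS = 100_000                              # total agents, e.g. immune cells
local_n = N_AGENTS // size + (rank < N_AGENTS % size)  # this rank's share
rng = np.random.default_rng(rank)
state = rng.integers(0, 3, size=local_n)        # 0=susceptible, 1=infected, 2=dead

for step in range(100):
    # Placeholder local interaction rule: infection occurs with probability p.
    p_infect = 0.01
    newly_infected = (state == 0) & (rng.random(local_n) < p_infect)
    state[newly_infected] = 1

    # Global summary exchanged between ranks (mimicking coupling between nodes
    # in the lymphatic chain).
    local_infected = int((state == 1).sum())
    total_infected = comm.allreduce(local_infected, op=MPI.SUM)

    if rank == 0 and step % 20 == 0:
        print(f"step {step}: {total_infected} infected agents")
```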

Relevance:

10.00%

Publisher:

Abstract:

Over the last few years, investigations of human epigenetic profiles have identified the key elements of change to be histone modifications, stable and heritable DNA methylation, and chromatin remodeling. These factors determine gene expression levels and characterise conditions leading to disease. In order to extract information embedded in long DNA sequences, data mining and pattern recognition tools are widely used, but efforts to date have been limited with respect to analyzing epigenetic changes and their role as catalysts in disease onset. Useful insight, however, can be gained by investigating the associated dinucleotide distributions. The focus of this paper is to explore the frequencies of specific dinucleotides across defined regions within the human genome, and to identify new patterns between epigenetic mechanisms and DNA content. Signal processing methods, including Fourier and wavelet transformations, are employed and the principal results are reported.
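An illustrative sketch of this kind of dinucleotide signal analysis, using NumPy's FFT and the PyWavelets package on a synthetic random sequence; the real analysis targets defined genomic regions rather than random data:

```python
import numpy as np
import pywt  # PyWavelets

# Illustrative only: a random sequence stands in for a genomic region.
rng = np.random.default_rng(1)
seq = "".join(rng.choice(list("ACGT"), size=4096))

# Binary indicator signal: 1 where a CG dinucleotide starts, else 0.
cg = np.array([1.0 if seq[i:i + 2] == "CG" else 0.0 for i in range(len(seq) - 1)])

# Fourier spectrum of the mean-centred signal (periodicities in CG spacing).
spectrum = np.abs(np.fft.rfft(cg - cg.mean()))
freqs = np.fft.rfftfreq(cg.size)
peak = 1 + spectrum[1:].argmax()                 # strongest non-DC component

# Multi-resolution wavelet decomposition of the same signal.
coeffs = pywt.wavedec(cg, "db4", level=5)

print(f"CG density {cg.mean():.3f}; strongest periodicity ~ {1.0 / freqs[peak]:.1f} bp")
print("wavelet coefficient lengths:", [c.size for c in coeffs])
```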

Relevance:

10.00%

Publisher:

Abstract:

Background: There is evidence that family and friends influence children's decisions to smoke.

Objectives: To assess the effectiveness of interventions to help families stop children starting smoking.

Search methods: We searched 14 electronic bibliographic databases, including the Cochrane Tobacco Addiction Group specialized register, MEDLINE, EMBASE, PsycINFO, CINAHL, unpublished material, and key articles' reference lists. We performed free-text internet searches and targeted searches of appropriate websites, and hand-searched key journals not available electronically. We consulted authors and experts in the field. The most recent search was 3 April 2014. There were no date or language limitations.

Selection criteria: Randomised controlled trials (RCTs) of interventions with children (aged 5-12) or adolescents (aged 13-18) and families to deter tobacco use. The primary outcome was the effect of the intervention on the smoking status of children who reported no use of tobacco at baseline. Included trials had to report outcomes measured at least six months from the start of the intervention.

Data collection and analysis: We reviewed all potentially relevant citations and retrieved the full text to determine whether the study was an RCT and matched our inclusion criteria. Two authors independently extracted study data for each RCT and assessed them for risk of bias. We pooled risk ratios using a Mantel-Haenszel fixed-effect model.

Main results: Twenty-seven RCTs were included. The interventions were very heterogeneous in the components of the family intervention, the other risk behaviours targeted alongside tobacco, the age of children at baseline, and the length of follow-up. Two interventions were tested by two RCTs, one was tested by three RCTs, and the remaining 20 distinct interventions were each tested by only one RCT. Twenty-three interventions were tested in the USA, two in Europe, one in Australia and one in India. The control conditions fell into two main groups: no intervention or usual care; or school-based interventions provided to all participants. These two groups of studies were considered separately. Most studies had a judgement of 'unclear' for at least one risk of bias criterion, so the quality of evidence was downgraded to moderate. Although there was heterogeneity between studies, there was little evidence of statistical heterogeneity in the results. We were unable to extract data from all studies in a format that allowed inclusion in a meta-analysis. There was moderate quality evidence that family-based interventions had a positive impact on preventing smoking when compared to a no intervention control. Nine studies (4810 participants) reporting smoking uptake amongst baseline non-smokers could be pooled, but eight studies with about 5000 participants could not be pooled because of insufficient data. The pooled estimate detected a significant reduction in smoking behaviour in the intervention arms (risk ratio [RR] 0.76, 95% confidence interval [CI] 0.68 to 0.84). Most of these studies used intensive interventions. Estimates for the medium and low intensity subgroups were similar, but confidence intervals were wide. Two studies in which some of the 4487 participants already had smoking experience at baseline did not detect evidence of an effect (RR 1.04, 95% CI 0.93 to 1.17). Eight RCTs compared a combined family plus school intervention to a school intervention only. Of the three studies with data, two RCTs with outcomes for 2301 baseline never-smokers detected evidence of an effect (RR 0.85, 95% CI 0.75 to 0.96), and one study with data for 1096 participants not restricted to never-users at baseline also detected a benefit (RR 0.60, 95% CI 0.38 to 0.94). The other five studies, with about 18,500 participants, did not report data in a format allowing meta-analysis. One RCT also compared a family intervention to a school 'good behaviour' intervention and did not detect a difference between the two types of programme (RR 1.05, 95% CI 0.80 to 1.38, n = 388). No studies identified any adverse effects of intervention.

Authors' conclusions: There is moderate quality evidence to suggest that family-based interventions can have a positive effect on preventing children and adolescents from starting to smoke. There were more studies of high intensity programmes compared to a control group receiving no intervention than there were for other comparisons. The evidence is therefore strongest for high intensity programmes used independently of school interventions. Programmes typically addressed family functioning, and were introduced when children were between 11 and 14 years old. Based on this moderate quality evidence, a family intervention might reduce uptake or experimentation with smoking by between 16 and 32%. However, these findings should be interpreted cautiously because effect estimates could not include data from all studies. Our interpretation is that the common feature of the effective high intensity interventions was encouraging authoritative parenting (which is usually defined as showing strong interest in and care for the adolescent, often with rule setting). This is different from authoritarian parenting (do as I say) or neglectful or unsupervised parenting.
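As a rough illustration of the Mantel-Haenszel fixed-effect pooling of risk ratios used in this kind of review, with invented counts for three hypothetical trials rather than the review's data:

```python
import numpy as np

# Made-up 2x2 counts: smoking-uptake events and group sizes for the
# intervention (family programme) and control arms of three trials.
a, n1 = np.array([30, 45, 12]), np.array([400, 600, 150])   # intervention arm
c, n2 = np.array([42, 60, 15]), np.array([410, 590, 140])   # control arm

N = n1 + n2
# Mantel-Haenszel pooled risk ratio: sum(a*n2/N) / sum(c*n1/N).
rr_mh = (a * n2 / N).sum() / (c * n1 / N).sum()
print(f"pooled RR = {rr_mh:.2f}")   # values below 1 favour the intervention
```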

Relevance:

10.00%

Publisher:

Abstract:

Objective: To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Design: Systematic review. Data sources: The electronic databases searched included PubMed, CINAHL, Medline, Google Scholar, and ProQuest. The bibliography of all relevant articles was examined and associated articles were identified using a snowballing technique. Selection criteria: For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. Methods: The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Results: Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold-standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalizability, source data quality, complexity of models, and integration of content and technical knowledge were discussed. Conclusions: The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, involvement of computer scientists in the injury prevention field, and more comprehensive use and description of quality assurance methods in text mining approaches, it is likely that we will see continued growth and advancement in knowledge of text mining in the injury field.
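A toy sketch of the kind of supervised text classification such studies apply to injury narratives, here using a multinomial naive Bayes classifier from scikit-learn; the narratives, labels, and pipeline choices are made-up examples, not drawn from any of the reviewed papers:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny invented injury narratives with cause-of-injury labels, purely illustrative.
narratives = [
    "worker fell from ladder while painting ceiling",
    "slipped on wet floor in kitchen and struck head",
    "hand caught in conveyor belt during maintenance",
    "burned forearm on hot press in factory",
]
labels = ["fall", "fall", "machinery", "burn"]

# Bag-of-words / TF-IDF features feeding a naive Bayes category predictor.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
clf.fit(narratives, labels)

print(clf.predict(["fell off scaffolding while carrying paint"]))  # expected: fall
```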

Relevance:

10.00%

Publisher:

Abstract:

Understanding the load applied on the residuum through the prosthesis of individuals with transfemoral amputation (TFA) is essential to addressing a number of concerns that can strongly reduce their quality of life (e.g., residuum skin lesions, prosthesis fitting, alignment). This inner prosthesis loading can be estimated in a typical gait laboratory using inverse dynamics equations. Alternatively, technological advances proposed over the last decade have enabled direct measurement of this kinetic information in a broad variety of situations that could potentially be more relevant in clinical settings. The purposes of this presentation are (A) to review the literature on recent developments in the measurement and analysis of inner prosthesis loading in TFA, and (B) to extract information that could contribute to better evidence-based practice.
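A toy, quasi-static sketch of the kind of calculation a gait-laboratory inverse dynamics pipeline performs; all numbers are invented, and a full analysis would also account for segment inertia and joint kinematics:

```python
import numpy as np

# Quasi-static 2D example: net external moment about a joint centre produced
# by a measured ground reaction force, ignoring segment inertia for brevity.
grf = np.array([50.0, 800.0])    # ground reaction force (N), x and y components
cop = np.array([0.12, 0.0])      # centre of pressure on the force plate (m)
joint = np.array([0.05, 0.45])   # joint centre, e.g. the prosthetic knee (m)

r = cop - joint                          # moment arm vector from joint to CoP
moment = r[0] * grf[1] - r[1] * grf[0]   # 2D cross product (N*m), positive = CCW
print(f"net external moment about the joint ~ {moment:.1f} N*m")
```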