950 results for Search Strategies


Relevance: 20.00%

Abstract:

Survey results provide a preliminary assessment of the relative contribution of a range of tactical business strategies to innovation performance by firms in the Australian construction industry. Over 1,300 firms were surveyed in 2004, resulting in a response rate of 29%. Respondents were classified as high, medium or low innovators according to an innovation index based on the novelty and impact of their innovations and their adoption of listed technological and organizational advances. The relative significance of 23 business strategies concerning (1) employees, (2) marketing, (3) technology, (4) knowledge and (5) relationships was examined by determining the extent to which they distinguished high innovators from low innovators. The individual business strategies that most strongly distinguished high innovators were (1) ‘investing in R&D’, (2) ‘participating in partnering and alliances on projects’, (3) ‘ensuring project learnings are transferred into continuous business processes’, (4) ‘monitoring international best practice’, and (5) ‘recruiting new graduates’. Of the five types of strategies assessed, marketing strategies were the least significant in supporting innovation. The results provide practical guidance to managers in project-based industries wishing to improve their innovation performance.

Relevance: 20.00%

Abstract:

Damage localization induced by strain softening can be predicted by direct minimization of a global energy function. This article concerns the computational strategy for implementing this principle for softening materials such as concrete. Instead of using heuristic global optimization techniques, our strategy hybridizes local optimization methods with a path-finding approach to ensure a global optimum. With admissible nodal displacements as the independent variables, the geometric (mesh) constraint conditions are easy to handle. The direct search optimization methods recover the localized solutions for a range of softening lattice models that are representative of quasi-brittle structures.
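
As a rough illustration of the idea (not the authors' implementation), the sketch below directly minimizes the total energy of a one-dimensional chain of softening springs over the free nodal displacements. The spring law, the parameters, and the multi-start loop standing in for the path-finding step are all assumptions made for brevity.

```python
# Direct energy minimization for a 1D chain of softening springs.
# Illustrative sketch only: material law and parameters are assumed.
import numpy as np
from scipy.optimize import minimize

n, u_end = 10, 5.0            # number of springs, prescribed end displacement

def spring_energy(e):
    """Stored plus dissipated energy of one spring at elongation e."""
    e = abs(e)
    if e <= 1.0:              # elastic branch (unit stiffness, peak strain 1)
        return 0.5 * e**2
    if e <= 1.5:              # linear softening to zero stress at e = 1.5
        return 0.5 + (e - 1.0) - (e - 1.0) ** 2
    return 0.75               # fully softened: all fracture energy dissipated

def total_energy(u_free):
    u = np.concatenate(([0.0], u_free, [u_end]))   # clamp both end nodes
    return sum(spring_energy(d) for d in np.diff(u))

# Multi-start local minimization as a crude stand-in for the path-finding step.
rng, best = np.random.default_rng(0), None
for _ in range(30):
    u0 = np.sort(rng.uniform(0.0, u_end, n - 1))   # monotone random start
    res = minimize(total_energy, u0, method="Nelder-Mead",
                   options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-9})
    if best is None or res.fun < best.fun:
        best = res

u = np.concatenate(([0.0], best.x, [u_end]))
print(best.fun, np.round(np.diff(u), 2))   # elongations localize in one spring
```

With these parameters the homogeneous (all-elastic) state costs more energy than a state in which a single spring softens completely, so the minimizer should return a localized elongation pattern, mimicking the damage localization the abstract describes.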

Relevance: 20.00%

Abstract:

Currently, well-established clinical therapeutic approaches for bone reconstruction are restricted to the transplantation of autografts and allografts, and the implantation of metal devices or ceramic-based implants to assist bone regeneration. Bone grafts possess osteoconductive and osteoinductive properties; however, they are limited in access and availability and associated with donor site morbidity, haemorrhage, risk of infection, insufficient transplant integration, graft devitalisation, and subsequent resorption resulting in decreased mechanical stability. As a result, recent research has focused on the development of alternative therapeutic concepts. The field of tissue engineering has emerged as an important approach to bone regeneration. However, bench-to-bedside translations are still infrequent, as the process towards approval by regulatory bodies is protracted and costly, requiring comprehensive in vitro and in vivo studies. The resulting gap between research and clinical translation, and hence commercialization, is referred to as the ‘Valley of Death’: a large number of projects and/or ventures are abandoned for lack of funding during the transition from product/technology development to regulatory approval and subsequent commercialization. One of the greatest difficulties in bridging the Valley of Death is to develop good manufacturing practice (GMP) processes and scalable designs and to apply these in pre-clinical studies. In this article, we describe part of the rationale and road map by which our multidisciplinary research team has approached the first steps of translating orthopaedic bone engineering from bench to bedside by establishing a pre-clinical ovine critical-sized tibial segmental bone defect model, and we discuss our preliminary data relating to this decisive step.

Relevance: 20.00%

Abstract:

Truck overloading is a persistent problem around the world. Overloaded truck traffic often accompanies rapid economic development: many developing countries prioritize economic growth, so investment in the supporting road infrastructure is limited. This research investigates the relationship between truck overloading and road damage. The objective is to determine the economic loss caused by overloaded truck traffic. Axle load data are used to calculate the total equivalent single axle loads (ESALs) applied to the pavement. The study provides perspective on how changes in axle load due to overloading affect the service life of pavement, and the approach can be used to estimate pavement damage in other developing countries facing truck overloading. In conclusion, economic losses were identified, including reduced pavement life and increased maintenance and rehabilitation (M&R) costs. As a result, the net present value (NPV) of pavement investment under overloaded truck traffic is higher than under normal truck traffic.
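
As a hedged illustration of the ESAL arithmetic, using the common AASHTO fourth-power rule of thumb (which may differ from the exact model used in the paper), one axle pass can be converted into equivalent standard-axle passes as follows; the example loads are assumptions, not data from the study.

```python
# Fourth-power approximation of relative pavement damage per axle pass.
# The 80 kN standard axle and exponent 4 are the usual AASHTO defaults.
def esal_factor(axle_load_kn: float, standard_kn: float = 80.0,
                exponent: float = 4.0) -> float:
    """Equivalent single axle load factor for one pass of the given axle."""
    return (axle_load_kn / standard_kn) ** exponent

legal, overloaded = 80.0, 120.0            # kN, assumed for illustration
print(esal_factor(legal))                  # 1.0
print(esal_factor(overloaded))             # ~5.06: a 50% overload does ~5x damage
```

The steep nonlinearity is the mechanism behind the paper's finding: modest overloading multiplies accumulated ESALs, shortening pavement life and raising M&R costs.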

Relevance: 20.00%

Abstract:

Brand loyalty is a concept that has garnered considerable interest in recent years from marketing practitioners and academics alike. While marketers are primarily interested in ways to generate and increase brand loyalty among their customers, academics strive to conduct research that investigates the antecedents and consequences of customer loyalty (see DeWitt, Nguyen and Marshall 2008; Russell-Bennett, McColl-Kennedy and Coote 2007).

Relevance: 20.00%

Abstract:

In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers’ behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users’ actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links; however, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The time period rarely affects navigational and informational queries, while rates for transactional queries vary across periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
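
For readers unfamiliar with the technique, a minimal sketch of one-step-ahead prediction with an exogenous input series is shown below. The synthetic hourly data and the SARIMAX model order are assumptions loosely standing in for the study's transactional log and transfer-function analysis; they are not the authors' model.

```python
# One-step-ahead prediction of hourly click counts driven by an input series
# (query volume), as a crude stand-in for a transfer-function analysis.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
hours = pd.date_range("2024-01-01", periods=240, freq="h")
# Assumed input series with a daily cycle, plus noise.
queries = pd.Series(100 + 20 * np.sin(2 * np.pi * np.arange(240) / 24)
                    + rng.normal(0, 5, 240), index=hours)
# Assumed output series: clicks lag query volume by one hour.
clicks = 0.1 * queries.shift(1).bfill() + rng.normal(0, 1, 240)

# AR(1) errors with the query series as an exogenous regressor.
model = SARIMAX(clicks, exog=queries, order=(1, 0, 0))
res = model.fit(disp=False)
one_step = res.get_prediction().predicted_mean   # in-sample one-step forecasts
print(one_step.tail())
```

In the study itself the input and output series would be behaviors mined from the log (e.g., query submissions and sponsored-link clicks per period) rather than synthetic data.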

Relevance: 20.00%

Abstract:

Here we search for evidence of the existence of a sub-chondritic 142Nd/144Nd reservoir that balances the Nd isotope chemistry of the Earth relative to chondrites. If present, it may reside in the source region of deeply sourced mantle plume material. We suggest that lavas from Hawai’i with coupled elevations in 186Os/188Os and 187Os/188Os, from Iceland that represent mixing of upper mantle and lower mantle components, and from Gough with sub-chondritic 143Nd/144Nd and high 207Pb/206Pb, are favorable samples that could reflect mantle sources that have interacted with an Early-Enriched Reservoir (EER) with sub-chondritic 142Nd/144Nd. High-precision Nd isotope analyses of basalts from Hawai’i, Iceland and Gough demonstrate no discernible 142Nd/144Nd deviation from terrestrial standards. These data are consistent with previous high-precision Nd isotope analyses of recent mantle-derived samples and demonstrate that no mantle-derived material to date provides evidence for the existence of an EER in the mantle. We then evaluate mass balance in the Earth with respect to both 142Nd/144Nd and 143Nd/144Nd. The Nd isotope systematics of EERs are modeled for different sizes and times of formation relative to ε143Nd estimates of the reservoirs in the μ142Nd = 0 Earth, where μ142Nd = ((142Nd/144Nd)measured/(142Nd/144Nd)standard − 1) × 10^6 and the μ142Nd = 0 Earth is the proportion of the silicate Earth with 142Nd/144Nd indistinguishable from the terrestrial standard. The models indicate that it is not possible to balance the Earth with respect to both 142Nd/144Nd and 143Nd/144Nd unless the μ142Nd = 0 Earth has an ε143Nd within error of the present-day depleted mid-ocean ridge basalt mantle source (DMM). The 4567 Myr 142Nd–143Nd isochron for the Earth intersects μ142Nd = 0 at ε143Nd of +8 ± 2, providing a minimum ε143Nd for the μ142Nd = 0 Earth. The high ε143Nd of the μ142Nd = 0 Earth is confirmed by the Nd isotope systematics of Archean mantle-derived rocks, which consistently have positive ε143Nd. If the EER formed early (0–70 Myr after solar system formation), continental crust and DMM can be complementary reservoirs with respect to Nd isotopes, with no requirement for significant additional reservoirs. If the EER formed after 70 Myr, then the μ142Nd = 0 Earth must have a bulk ε143Nd more radiogenic than DMM, and additional high-ε143Nd material is required to balance the Nd isotope systematics of the Earth.
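
For reference, the isotope notations used above can be written out explicitly. The μ definition follows the abstract; the ε definition relative to the chondritic uniform reservoir (CHUR) is the conventional one and is added here for completeness, not spelled out in the abstract.

```latex
% mu-notation: part-per-million deviation from the terrestrial standard
\[
\mu^{142}\mathrm{Nd} = \left(
  \frac{\left(^{142}\mathrm{Nd}/^{144}\mathrm{Nd}\right)_{\mathrm{measured}}}
       {\left(^{142}\mathrm{Nd}/^{144}\mathrm{Nd}\right)_{\mathrm{standard}}}
  - 1 \right) \times 10^{6}
\]

% epsilon-notation: part-per-ten-thousand deviation from the chondritic
% uniform reservoir (CHUR); conventional definition, added for reference
\[
\varepsilon^{143}\mathrm{Nd} = \left(
  \frac{\left(^{143}\mathrm{Nd}/^{144}\mathrm{Nd}\right)_{\mathrm{measured}}}
       {\left(^{143}\mathrm{Nd}/^{144}\mathrm{Nd}\right)_{\mathrm{CHUR}}}
  - 1 \right) \times 10^{4}
\]
```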

Relevance: 20.00%

Abstract:

This paper reports preliminary results from a study modeling the interplay between multitasking, cognitive coordination, and cognitive shifts during Web search. Study participants conducted three Web searches on personal information problems. Data collection techniques included pre- and post-search questionnaires, think-aloud protocols, Web search logs, observation, and post-search interviews. Key findings include: (1) users’ Web searches included multitasking, cognitive shifting and cognitive coordination processes; (2) cognitive coordination is the hinge linking multitasking and cognitive shifting that enables Web search construction; (3) cognitive shift levels determine the process of cognitive coordination; and (4) cognitive coordination is an interplay of task, mechanism and strategy levels that underpins multitasking and task switching. An initial model depicts the interplay between multitasking, cognitive coordination, and cognitive shifts during Web search. Implications of the findings and further research are also discussed.

Relevance: 20.00%

Abstract:

It is a significant challenge to clearly identify the boundary between positive and negative document streams. Several attempts have used negative feedback to address this challenge; however, two issues arise in using negative relevance feedback to improve the effectiveness of information filtering. The first is how to select constructive negative samples in order to reduce the space of negative documents. The second is how to decide which noisy extracted features should be updated based on the selected negative samples. This paper proposes a pattern-mining-based approach that selects some offenders from the negative documents, where an offender can be used to reduce the side effects of noisy features. It also classifies extracted features (i.e., terms) into three categories: positive specific terms, general terms, and negative specific terms. In this way, multiple revising strategies can be used to update the extracted features. An iterative learning algorithm is also proposed to implement this approach, and substantial experiments on RCV1 show that the proposed approach achieves encouraging performance.
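
A toy sketch of the three-way term split described above is given below. The data structures and the purely set-membership decision rule are assumptions for illustration; the paper's pattern mining and revising strategies are more involved.

```python
# Classify extracted terms by where they occur: only in positive documents
# ("positive specific"), only in selected negative offenders ("negative
# specific"), or in both ("general"). Illustrative rule only.
def classify_terms(pos_docs, neg_offenders):
    pos_terms = {t for doc in pos_docs for t in doc}
    neg_terms = {t for doc in neg_offenders for t in doc}
    return {
        "positive specific": pos_terms - neg_terms,
        "general": pos_terms & neg_terms,
        "negative specific": neg_terms - pos_terms,
    }

print(classify_terms(
    pos_docs=[["mining", "pattern"], ["pattern", "filter"]],
    neg_offenders=[["noise", "pattern"]],
))
```

Each category can then be revised differently, e.g., boosting positive specific terms and penalising negative specific ones, which is the role the revising strategies play in the proposed approach.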

Relevance: 20.00%

Abstract:

This paper compares the performance of two optimisation techniques for solving inverse problems: the first uses the Hierarchical Asynchronous Parallel Evolutionary Algorithms (HAPEA) software, and the second is implemented with a game strategy named Nash-EA. The HAPEA software is based on a hierarchical topology and asynchronous parallel computation. The Nash-EA methodology is introduced as a distributed virtual game and consists of splitting the wing design variables (aerofoil sections) among players, each optimising its own strategy. The HAPEA and Nash-EA methodologies are applied to a single-objective aerodynamic reconstruction of the ONERA M6 wing. Numerical results from the two approaches are compared in terms of model quality and computational expense, and they demonstrate the superiority of the distributed Nash-EA methodology in a parallel environment for a similar design quality.
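
A minimal sketch of the Nash decomposition idea follows: two players alternately optimise their own block of design variables while the other block is frozen, converging to a Nash equilibrium. The quadratic stand-in objective and the use of a plain local optimiser (rather than an evolutionary algorithm, as in Nash-EA) are simplifications for brevity.

```python
# Alternating best responses between two "players", each owning one design
# variable of a shared objective. Toy objective, not the ONERA M6 problem.
import numpy as np
from scipy.optimize import minimize

def objective(x):                     # stand-in for the aerodynamic cost
    return (x[0] - 1) ** 2 + (x[1] + 2) ** 2 + 0.5 * x[0] * x[1]

x = np.zeros(2)
for _ in range(20):                   # iterate best responses to equilibrium
    x[0] = minimize(lambda a: objective([a[0], x[1]]), [x[0]]).x[0]
    x[1] = minimize(lambda b: objective([x[0], b[0]]), [x[1]]).x[0]
print(np.round(x, 3))                 # Nash equilibrium of the split game
```

In Nash-EA each player would own a set of aerofoil sections and run its own evolutionary search, which is what makes the method naturally distributable across a parallel environment.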

Relevance: 20.00%

Abstract:

The issue of what an effective high-quality/high-equity education system might look like remains contested. Indeed, there is more educational commentary on systems that do not achieve this goal (see, for example, Luke & Woods, 2009 for a detailed review of the No Child Left Behind policy initiatives put forward in the United States under the Bush Administration) than there is detailed consideration of what such a system might enact and represent. A long-held critique of socio-cultural and critical perspectives in education has been their focus on deconstruction to the supposed detriment of reconstructive work. This critique is less warranted in recent times, based on work in the field, especially the plethora of qualitative research focusing on case studies of ‘best practice’. However, it certainly remains the case that there is more work to be done in investigating the characteristics of a socially just system. This issue of Point and Counterpoint aims to progress such a discussion. Several of the authors call for a reconfiguration of the use of large-scale comparative assessment measures, and all suggest new ways of thinking about quality and equity for school systems. Each of the papers tackles different aspects of the problem of how to achieve high equity without compromising quality within a large education system. They each take a reconstructive focus, highlighting ways forward for education systems in Australia and beyond. While each paper investigates different aspects of the issue, the clearly stated objective of seeking to delineate and articulate the characteristics of socially just education is consistent throughout the issue.

Relevance: 20.00%

Abstract:

The novel manuscript Girl in the Shadows tells the story of two teenage girls whose friendship, safety and sanity are pushed to the limits when an unexplained phenomenon invades their lives. Sixteen-year-old Tash has everything a teenage girl could want: good looks, brains and freedom from her busy parents. But when she looks into her mirror, a stranger’s face stares back at her. Her best friend Mal believes it’s an evil spirit and enters the world of the supernatural to find answers. But spell books and Ouija boards cannot fix a problem that comes from deep within the soul. It will take a journey to the edge of madness for Tash to face the truth inside her heart and see the evil that lurks in her home. And Mal’s love and courage to pull her back into life. The exegesis examines resilience and coping strategies in adolescence, in particular the relationship of trauma to brain development in children and teenagers. It draws on recent discoveries in neuroscience and psychology to provide a framework for examining the role of coping strategies in building resilience. Within this broader context, it analyses two works of contemporary young adult fiction, Freaky Green Eyes by Joyce Carol Oates and Sonya Hartnett’s Surrender, examining their use of the split persona as a coping mechanism and the potential of young adult literature as a tool to help build resilience in teen readers.

Relevance: 20.00%

Abstract:

Objective: To summarise the extent to which narrative text fields in administrative health data are used to gather information about the event resulting in presentation to a health care provider for treatment of an injury, and to highlight best-practice approaches to interrogating narrative text for injury surveillance purposes.----- Design: Systematic review.----- Data sources: Electronic databases searched included CINAHL, Google Scholar, Medline, Proquest, PubMed and PubMed Central. Snowballing strategies were employed by searching the bibliographies of retrieved references to identify relevant associated articles.----- Selection criteria: Papers were selected if the study used a health-related database and if the study objectives were to (a) use text fields to identify injury cases or to extract additional information on injury circumstances not available from coded data, (b) use text fields to assess the accuracy of coded data fields for injury-related cases, or (c) describe methods/approaches for extracting injury information from text fields.----- Methods: The papers identified through the search were independently screened by two authors for inclusion, resulting in 41 papers selected for review. Due to heterogeneity between studies, meta-analysis was not performed.----- Results: The majority of papers reviewed (28 papers) focused on describing injury epidemiology trends using coded data supplemented by text fields; these studies demonstrated the value of text data for providing more specific information beyond what had been coded, enabling case selection or providing circumstantial information. Caveats were expressed about the consistency and completeness of recorded text information, which can result in underestimates when these data are used. Four coding validation papers were reviewed, showing the utility of text data for validating and checking the accuracy of coded data. Seven studies (9 papers) described methods for interrogating injury text fields for systematic extraction of information, with a combination of manual and semi-automated methods used to refine and develop algorithms for the extraction and classification of information from text. Quality assurance approaches to assessing the robustness of the text extraction methods were discussed in only 8 of the epidemiology papers and 1 of the coding validation papers. All of the text interrogation methodology papers described systematic approaches to ensuring the quality of the approach.----- Conclusions: Manual review and coding approaches, text search methods, and statistical tools have been utilised to extract data from narrative text and translate it into usable, detailed injury event information. These techniques can be, and have been, applied to administrative datasets to identify specific injury types and to add value to previously coded injury datasets. Only a few studies thoroughly described the methods used for text mining, and fewer than half of the reviewed studies used or described quality assurance methods for ensuring the robustness of the approach. New techniques utilising semi-automated computerised approaches and Bayesian/clustering statistical methods offer the potential to further develop and standardise the analysis of narrative text for injury surveillance.
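
As a toy illustration of the simplest text-interrogation step described above, the sketch below flags injury records whose narrative matches case-definition keywords. The keyword list and record layout are hypothetical, not a validated surveillance algorithm; as the reviewed studies emphasise, real case definitions need quality assurance against coded data.

```python
# Flag injury records whose free-text narrative matches fall-related keywords.
# Hypothetical keywords and records, for illustration only.
import re

FALL_PATTERN = re.compile(r"\b(fell|fall|slipped|tripped)\b", re.IGNORECASE)

records = [
    {"icd_code": "W01", "narrative": "PT SLIPPED ON WET FLOOR AT WORK"},
    {"icd_code": "V43", "narrative": "Driver injured in two-car collision"},
]
fall_cases = [r for r in records if FALL_PATTERN.search(r["narrative"])]
print([r["icd_code"] for r in fall_cases])   # ['W01']
```

More sophisticated pipelines replace the keyword rule with semi-automated classification (e.g., the Bayesian/clustering methods mentioned in the conclusions) but follow the same record-flagging structure.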

Relevance: 20.00%

Abstract:

Objective: The objectives of this article are to explore the extent to which the International Statistical Classification of Diseases and Related Health Problems (ICD) has been used in child abuse research, to describe how the ICD system has been applied, and to assess factors affecting the reliability of ICD-coded data in child abuse research.----- Methods: PubMed, CINAHL, PsycINFO and Google Scholar were searched for peer-reviewed articles written since 1989 that used the ICD as the classification system to identify cases and research child abuse using health databases. Snowballing strategies were also employed by searching the bibliographies of retrieved references to identify relevant associated articles. The papers identified through the search were independently screened by two authors for inclusion, resulting in 47 studies selected for the review. Due to heterogeneity of studies, meta-analysis was not performed.----- Results: This paper highlights both the utility and the limitations of ICD-coded data. ICD codes have been widely used to conduct research into child maltreatment in health data systems. The codes appear to be used primarily to determine child maltreatment patterns within identified diagnoses or to identify child maltreatment cases for research.----- Conclusions: A significant impediment to the use of ICD codes in child maltreatment research is the under-ascertainment of child maltreatment when coded data are used alone. This is most clearly identified, and to some degree quantified, in research where data linkage is used.----- Practice implications: Improved identification of child maltreatment will assist in identifying risk factors, in creating programs that can prevent and treat child maltreatment, and in meeting reporting obligations under the Convention on the Rights of the Child (CRC).
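
A minimal sketch of ICD-based case selection follows. The record layout is assumed; T74 (maltreatment syndromes) and Y07 (perpetrator of assault/maltreatment) are genuine ICD-10 blocks, but, as the review's conclusions warn, code-only selection under-ascertains cases and a real case definition would need validation.

```python
# Select records carrying ICD-10 maltreatment diagnosis codes (T74.*) or
# perpetrator codes (Y07.*). Record layout is assumed for illustration.
MALTREATMENT_PREFIXES = ("T74", "Y07")

records = [
    {"id": 1, "dx_codes": ["S06.0", "T74.1"]},   # includes physical abuse code
    {"id": 2, "dx_codes": ["J18.9"]},            # pneumonia only
]
cases = [r for r in records
         if any(code.startswith(MALTREATMENT_PREFIXES) for code in r["dx_codes"])]
print([r["id"] for r in cases])                  # [1]
```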

Relevance: 20.00%

Abstract:

To date, most theories of business models have theorized value capture assuming that appropriability regimes are exogenous and that the firm faces a unique, ideal-typical appropriability regime. This has led theoretical contributions to focus on governance structures that minimize transaction costs, to downplay the interdependencies between value capture and value creation, and to ignore revenue generation strategies. We propose a reconceptualization of business models’ value capture mechanisms that relies on assumptions of endogeneity and multiplicity of appropriability regimes. This new approach to business model construction highlights the interdependencies and trade-offs between value creation and value capture offered by different types and combinations of appropriability regimes. The theory is illustrated by the analysis of three cases of open source software business models.