74 results for Components phenols of CNSL

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

Bone defect treatments can be augmented by mesenchymal stem cell (MSC) based therapies. MSC interaction with the extracellular matrix (ECM) of the surrounding tissue regulates their functional behavior. Understanding these specific regulatory mechanisms is essential for the therapeutic stimulation of MSCs in vivo; however, these interactions are presently only partially understood. This study examined in parallel, for the first time, the effects of 13 ECM components from bone, cartilage and hematoma, compared with a control protein, on the functional behavior of MSCs, and hence draws conclusions for rational biomaterial design. ECM components specifically modulated MSC adhesion, migration, proliferation, and osteogenic differentiation. For example, fibronectin facilitated migration, adhesion, and proliferation, but not osteogenic differentiation, whereas fibrinogen enhanced adhesion and proliferation, but not migration. Subsequently, the integrin expression pattern of MSCs was determined and related to cell behavior on specific ECM components. Finally, on this basis, peptide sequences are reported for the potential stimulation of MSC functions. Based on these results, ECM component coatings could be designed to specifically guide cell functions.

Relevance: 100.00%

Abstract:

The contribution of risky behaviour to the increased crash and fatality rates of young novice drivers is recognised in the road safety literature around the world. Exploring such risky driver behaviour has led to the development of tools like the Driver Behaviour Questionnaire (DBQ) to examine driving violations, errors, and lapses [1]. Whilst the DBQ has been utilised in young novice driver research, some items within this tool seem specifically designed for the older, more experienced driver, whilst others appear to assess both behaviour and related motives. The current study was prompted by the need for a risky behaviour measurement tool that can be utilised with young drivers with a provisional driving licence. Sixty-three items exploring young driver risky behaviour, developed from the road safety literature, were incorporated into an online survey. These items assessed driver, passenger, journey, car and crash-related issues. A sample of 476 drivers aged 17-25 years (M = 19, SD = 1.59 years) with a provisional driving licence, matched for age, gender, and education, was drawn from a state-wide sample of 761 young drivers who completed the survey. Factor analysis based upon a principal components extraction, followed by an oblique rotation, was used to investigate the underlying dimensions of young novice driver risky behaviour. A five factor solution comprising 44 items was identified, accounting for 55% of the variance in young driver risky behaviour. Factor 1 accounted for 32.5% of the variance and appeared to measure driving violations that were transient in nature - risky behaviours that followed risky decisions made during the journey (e.g., speeding). Factor 2 accounted for 10.0% of the variance and appeared to measure driving violations that were fixed in nature, the risky decisions being made before the journey (e.g., drink driving).
Factor 3 accounted for 5.4% of the variance and appeared to measure misjudgment (e.g., misjudging the speed of an oncoming vehicle). Factor 4 accounted for 4.3% of the variance and appeared to measure risky driving exposure (e.g., driving at night with friends as passengers). Factor 5 accounted for 2.8% of the variance and appeared to measure driver emotions or mood (e.g., anger). Given that the aim of the study was to create a research tool, the factors informed the development of five subscales and one composite scale. The composite scale had a very high internal consistency (Cronbach's alpha) of .947. Self-reported data on police-detected driving offences, crash involvement, and intentions to break road rules within the next year were also collected. While the composite scale was only weakly correlated with self-reported crashes (r = .16, p < .001), it was moderately correlated with offences (r = .26, p < .001), and highly correlated with intentions to break the road rules (r = .57, p < .001). Further application of the developed scale is needed to confirm the factor structure within other samples of young drivers, both in Australia and in other countries. In addition, future research could explore the applicability of the scale for investigating the behaviour of other types of drivers.
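
The internal consistency figure reported above can be reproduced in a few lines. The sketch below computes Cronbach's alpha for a simulated 476 × 44 item matrix; the data are synthetic, generated from a single shared factor purely for illustration, since the scale items themselves are not reproduced in this summary.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative only: simulated responses to a 44-item scale driven by one factor.
rng = np.random.default_rng(0)
factor = rng.normal(size=(476, 1))
scores = factor + 0.5 * rng.normal(size=(476, 44))
alpha = cronbach_alpha(scores)
```

With strongly correlated items, alpha comes out high, mirroring the .947 reported for the composite scale.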

Relevance: 100.00%

Abstract:

Proteasomes are cylindrical particles made up of a stack of four heptameric rings. In animal cells the outer rings are made up of 7 different types of alpha subunits, and the inner rings are composed of 7 out of 10 possible different beta subunits. Regulatory complexes can bind to the ends of the cylinder. We have investigated aspects of the assembly, activity and subunit composition of core proteasome particles and 26S proteasomes, the localization of proteasome subpopulations, and the possible role of phosphorylation in determining proteasome localization, activities and association with regulatory components.

Relevance: 100.00%

Abstract:

This paper considers VECMs for variables exhibiting cointegration and common features in the transitory components. While the presence of cointegration between the permanent components of series reduces the rank of the long-run multiplier matrix, a common feature among the transitory components leads to a rank reduction in the matrix summarizing the short-run dynamics. The common feature also implies that there exist linear combinations of the first-differenced variables in a cointegrated VAR that are white noise, and traditional tests focus on testing for this characteristic. An alternative, however, is to test the rank of the short-run dynamics matrix directly. Consequently, we use the literature on testing the rank of a matrix to produce some alternative test statistics. We also show that these are identical to one of the traditional tests. The performance of the different methods is illustrated in a Monte Carlo analysis, which is then used to re-examine an existing empirical study. Finally, this approach is applied to provide a check for the presence of common dynamics in DSGE models.
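
The idea of testing the rank of the short-run dynamics matrix directly can be illustrated numerically. In the sketch below, a hypothetical 3-variable system has a short-run matrix Γ constructed to have rank 2 (one common transitory feature); the rank deficiency shows up in the singular values, which is the quantity a formal rank test evaluates statistically.

```python
import numpy as np

# Hypothetical short-run dynamics matrix Gamma for a 3-variable VECM.
# A common feature among the transitory components means Gamma has reduced
# rank: here Gamma = a @ b.T with a, b of shape (3, 2), so rank(Gamma) = 2 < 3.
rng = np.random.default_rng(1)
a = rng.normal(size=(3, 2))
b = rng.normal(size=(3, 2))
gamma = a @ b.T

# Inspect the singular values: rank deficiency appears as (near-)zero values,
# which a statistical rank test formalizes in the presence of sampling noise.
singular_values = np.linalg.svd(gamma, compute_uv=False)
estimated_rank = np.linalg.matrix_rank(gamma)
```

In practice Γ is estimated, so the smallest singular value is never exactly zero; the rank-test statistics discussed in the paper account for that sampling variation.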

Relevance: 100.00%

Abstract:

The 12 to 13 July 2003 andesite lava dome collapse at the Soufrière Hills volcano, Montserrat, provides the first opportunity to document comprehensively both the sub-aerial and submarine sequence of events for an eruption. Numerous pyroclastic flows entered the ocean during the collapse, depositing approximately 90% of the total material into the submarine environment. During peak collapse conditions, as the main flow penetrated the air–ocean interface, phreatic explosions were observed and a surge cloud decoupled from the main flow body to travel 2 to 3 km over the ocean surface before settling. The bulk of the flow was submerged and rapidly mixed with sea water forming a water-saturated mass flow. Efficient sorting and physical differentiation occurred within the flow before initial deposition at 500 m water depth. The coarsest components (∼60% of the total volume) were deposited proximally from a dense granular flow, while the finer components (∼40%) were efficiently elutriated into the overlying part of the flow, which evolved into a far-reaching turbidity current.

Relevance: 100.00%

Abstract:

When wheels pass over insulated rail joints (IRJs), a vertical impact force is generated. The ability to measure the impact force is valuable, as the force signature helps in understanding the behaviour of IRJs, in particular their potential for failure. The impact forces are thought to be one of the main factors that cause damage to the IRJ and track components. Study of the deterioration mechanism helps in finding new methods to improve the service life of IRJs in track. In this research, a strain-gage-based wheel load detector is employed, for the first time, to measure the wheel–rail contact-impact force at an IRJ in a heavy haul rail line. In this technique, the strain gages are installed within the IRJ assembly without disturbing the structural integrity of the IRJ and are arranged in a full Wheatstone bridge to form a wheel load detector. The instrumented IRJ is first tested and calibrated in the lab and then installed in the field. For comparison purposes, a reference rail section is also instrumented with the same strain gage pattern as the IRJ. This paper presents the measurement technique, the instrumentation process and the tests, as well as some typical data obtained from the field and the inferences drawn from them.
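
For orientation, the sketch below shows the basic full-bridge arithmetic behind a strain-gage wheel load detector: for a full Wheatstone bridge with four equally active arms, Vout/Vex = GF × strain, and a lab calibration then maps strain to wheel load. All numbers (gage factor, excitation voltage, calibration constant) are assumed illustrative values, not figures from the study.

```python
# Full-bridge relation: Vout / Vex = GF * strain, so strain = Vout / (GF * Vex).
GAUGE_FACTOR = 2.0    # typical metallic foil gage factor (assumed)
V_EXCITATION = 10.0   # bridge excitation voltage in volts (assumed)

def bridge_voltage_to_strain(v_out: float) -> float:
    return v_out / (GAUGE_FACTOR * V_EXCITATION)

def strain_to_load(strain: float, calibration_kn_per_microstrain: float) -> float:
    # Lab calibration maps microstrain to wheel load in kN (value hypothetical).
    return strain * 1e6 * calibration_kn_per_microstrain

strain = bridge_voltage_to_strain(0.01)   # 10 mV bridge output
load_kn = strain_to_load(strain, 0.25)    # 500 microstrain -> 125 kN
```

The full-bridge arrangement also rejects temperature drift and bending components, which is one reason it suits a field installation like this one.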

Relevance: 100.00%

Abstract:

Increases in the functionality, power and intelligence of modern engineered systems have led to complex systems with a large number of interconnected dynamic subsystems. In such machines, faults in one subsystem can cascade and affect the behavior of numerous other subsystems. This complicates traditional fault monitoring procedures because of the need to train models of the faults that the monitoring system needs to detect and recognize. Unavoidable design defects, quality variations and different usage patterns make it infeasible to foresee all possible faults, resulting in limited diagnostic coverage that can only deal with previously anticipated and modeled failures. This leads to missed detections and costly blind swapping of acceptable components because of the inability to accurately isolate the source of previously unseen anomalies. To circumvent these difficulties, a new paradigm for diagnostic systems is proposed and discussed in this paper. Its feasibility is demonstrated through application examples in automotive engine diagnostics.

Relevance: 100.00%

Abstract:

To understand the underlying genetic architecture of cardiovascular disease (CVD) risk traits, we undertook a genome-wide linkage scan to identify CVD quantitative trait loci (QTLs) in 377 individuals from the Norfolk Island population. The central aim of this research focused on the utilization of a genetically and geographically isolated population of individuals from Norfolk Island for the purposes of variance component linkage analysis to identify QTLs involved in CVD risk traits. Substantial evidence supports the involvement of traits such as systolic and diastolic blood pressures, high-density lipoprotein-cholesterol, low-density lipoprotein-cholesterol, body mass index and triglycerides as important risk factors for CVD pathogenesis. In addition to the environmental influences of poor diet, reduced physical activity, increasing age, cigarette smoking and alcohol consumption, many studies have illustrated a strong involvement of genetic components in the CVD phenotype through family and twin studies. We undertook a genome scan using 400 markers spaced approximately 10 cM in 600 individuals from Norfolk Island. Genotype data was analyzed using the variance components methods of SOLAR. Our results gave a peak LOD score of 2.01 localizing to chromosome 1p36 for systolic blood pressure and replicated previously implicated loci for other CVD relevant QTLs.

Relevance: 100.00%

Abstract:

In Service-oriented Architectures, business processes can be realized by composing loosely coupled services. The problem of QoS-aware service composition is widely recognized in the literature. Existing approaches for computing an optimal solution to this problem tackle structured business processes, i.e., business processes which are composed of XOR-block, AND-block, and repeat loop orchestration components. To date, OR-block and unstructured orchestration components have not been sufficiently considered in the context of QoS-aware service composition. The work at hand addresses this shortcoming. An approach for computing an optimal solution to the service composition problem is proposed that considers structured orchestration components, such as AND/XOR/OR-blocks and repeat loops, as well as unstructured orchestration components.
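
The aggregation rules underlying such QoS computations can be sketched simply for the response-time dimension. The functions below are a minimal, hypothetical model, not the paper's formulation: sequences add, AND-blocks wait for the slowest branch, XOR-blocks average over branch probabilities, and the OR-block is reduced to a pessimistic parallel bound (finer formulations weight the possible branch subsets).

```python
# Response-time aggregation over orchestration components (illustrative model).
def sequence(times):             # services executed one after another
    return sum(times)

def and_block(branches):         # parallel branches: wait for the slowest
    return max(branches)

def xor_block(branches, probs):  # exactly one branch runs, chosen with prob p_i
    return sum(t * p for t, p in zip(branches, probs))

def or_block(branches):          # any subset may run in parallel
    return max(branches)         # pessimistic bound; subset-weighted in general

# Hypothetical process: two services in sequence, an AND-split, then an
# XOR-split whose branches are taken with probabilities 0.7 and 0.3.
total = sequence([10, 5]) + and_block([8, 12]) + xor_block([4, 20], [0.7, 0.3])
```

An optimizer for the composition problem would evaluate such aggregates for each candidate assignment of concrete services to tasks and pick the assignment with the best overall QoS subject to constraints.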

Relevance: 100.00%

Abstract:

The two-year trial of the Queensland minimum passing distance (MPD) road rule began on 7 April 2014. The rule requires motor vehicles to provide cyclists a minimum lateral passing distance of one metre when overtaking cyclists in a speed zone of 60 km/h or less, and 1.5 metres when the speed limit is greater than 60 km/h. This document summarises the evaluation of the effectiveness of the new rule in terms of its: 1. practical implementation; 2. impact on road users' attitudes and perceptions; and 3. road safety benefits. The Centre for Accident Research and Road Safety – Queensland (CARRS-Q) developed the evaluation framework (Haworth, Schramm, Kiata-Holland, Vallmuur, Watson & Debnath, 2014) for the Queensland Department of Transport and Main Roads (TMR) and was later commissioned to undertake the evaluation. The evaluation included the following components:
• Review of correspondence received by TMR;
• Interviews and focus groups with Queensland Police Service (QPS) officers;
• Road user survey;
• Observational study; and
• Crash, injury and infringement data analysis.

Relevance: 100.00%

Abstract:

It is unclear which theoretical dimension of psychological stress affects health status. We hypothesized that both distress and coping mediate the relationship between socio-economic position and tooth loss. Cross-sectional data from 2915 middle-aged adults evaluated retention of < 20 teeth, behaviors, psychological stress, and sociodemographic characteristics. Principal components analysis of the Perceived Stress Scale (PSS) extracted 'distress' (α = 0.85) and 'coping' (α = 0.83) factors, consistent with theory. Hierarchical entry of explanatory variables into age- and sex-adjusted logistic regression models estimated odds ratios (OR) and 95% confidence intervals [95% CI] for retention of < 20 teeth. Analysis of the separate contributions of distress and coping revealed a significant main effect of coping (OR = 0.7 [95% CI = 0.7-0.8]), but no effect for distress (OR = 1.0 [95% CI = 0.9-1.1]) or for the interaction of coping and distress. Behavior and psychological stress only modestly attenuated socio-economic inequality in retention of < 20 teeth, providing evidence to support a mediating role of coping.
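
For readers less familiar with the odds-ratio notation used above, the sketch below shows how an OR and its 95% CI are obtained from a logistic regression coefficient. The coefficient and standard error are hypothetical, chosen only to land near the reported coping effect of 0.7.

```python
import math

# Illustrative only: deriving an odds ratio and 95% CI from a logistic
# regression coefficient. Both beta and se below are assumed values.
beta = math.log(0.7)   # coefficient for the 'coping' factor (assumed)
se = 0.05              # standard error of the coefficient (assumed)

odds_ratio = math.exp(beta)                # OR = exp(beta)
ci_low = math.exp(beta - 1.96 * se)        # 95% CI bounds on the OR scale
ci_high = math.exp(beta + 1.96 * se)
```

An OR below 1 with a CI excluding 1, as here, indicates a protective effect: higher coping scores are associated with lower odds of retaining fewer than 20 teeth.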

Relevance: 100.00%

Abstract:

Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market shares from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standards battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: the activities of providers of packaged solutions and of IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature, however, identifies and depicts only two generic types of business models for open source software publishers: the ''bundling'' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an exploratory case study of IdealX, a French open source security software publisher. This case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for the appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not only considered as a way of generating income (a ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenblum, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.

Relevance: 100.00%

Abstract:

Objective: Theoretical models of post-traumatic growth (PTG) have been derived in the general trauma literature to describe the post-trauma experience that facilitates the perception of positive life changes. To develop a statistical model identifying factors that are associated with PTG, structural equation modelling (SEM) was used in the current study to assess the relationships between perception of diagnosis severity, rumination, social support, distress, and PTG. Method: A statistical model of PTG was tested in a sample of participants diagnosed with a variety of cancers (N = 313). Results: An initial principal components analysis of the measure used to assess rumination revealed three components: intrusive rumination, deliberate rumination of benefits, and life purpose rumination. SEM results indicated that the model fit the data well and that 30% of the variance in PTG was explained by the variables. Trauma severity was directly related to distress, but not to PTG. Deliberately ruminating on benefits and social support were directly related to PTG. Life purpose rumination and intrusive rumination were associated with distress. Conclusions: The model showed that in addition to having unique correlating factors, distress was not related to PTG, thereby providing support for the notion that these are discrete constructs in the post-diagnosis experience. The statistical model provides support that the post-diagnosis experience is simultaneously shaped by positive and negative life changes and that one or the other outcome may be prevalent or may occur concurrently. As such, an implication for practice is the need for supportive care that is holistic in nature.
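
The principal components step reported above can be illustrated with simulated data: eigen-decompose the items' correlation matrix and read off the proportion of variance each component explains. The item count and loadings below are hypothetical, not the actual rumination measure.

```python
import numpy as np

# Simulated item responses driven by three underlying rumination factors.
rng = np.random.default_rng(2)
latent = rng.normal(size=(313, 3))                 # three latent factors
loadings = rng.normal(size=(3, 12))                # 12 hypothetical items
items = latent @ loadings + 0.3 * rng.normal(size=(313, 12))

# PCA on the correlation matrix: eigenvalues give variance per component.
corr = np.corrcoef(items, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]       # sorted descending
explained = eigenvalues / eigenvalues.sum()        # proportion of variance
```

With three strong factors, the first three components dominate the explained variance, which is the pattern that leads an analyst to retain a three-component solution.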

Relevance: 100.00%

Abstract:

- Covers the entire research process from start to end
- Places particular emphasis on motivational components, modes of inquiry in scholarly conduct, theorizing and planning research
- Includes aspects such as publication and ethical challenges

This book is designed to introduce doctoral and other higher-degree research students to the process of scientific research in the fields of Information Systems as well as the fields of Information Technology, Business Process Management and other related disciplines within the social sciences. It guides research students in their process of learning the life of a researcher. In doing so, it provides an understanding of the essential elements, concepts and challenges of the journey into research studies. It also provides a gateway for the student to inquire more deeply about each element covered. Comprehensive and broad but also succinct and compact, the book focuses on the key principles and challenges for a novice doctoral student.

Relevance: 100.00%

Abstract:

An onboard payload may be seen in most instances as the "raison d'être" for a UAV. It will define its capabilities, usability and hence market value. Large and medium UAV payloads exhibit significant differences in size and computing capability when compared with small UAVs. The latter have stringent size, weight, and power requirements, typically referred to as SWaP, while the former still exhibit an endless appetite for compute capability. The tendency for this type of UAV (Global Hawk, Hunter, Fire Scout, etc.) is to increase payload density and hence processing capability. An example of this approach is the Northrop Grumman MQ-8 Fire Scout helicopter, which has a modular payload architecture that incorporates off-the-shelf components. Regardless of the UAV size and capabilities, advances in the miniaturization of electronics are enabling the replacement of multiprocessing, power-hungry general-purpose processors with more integrated and compact electronics (e.g., FPGAs). Payloads play a significant role in the quality of ISR (intelligence, surveillance, and reconnaissance) data, and also in how quickly that information can be delivered to the end user. At a high level, payloads are important enablers of greater mission autonomy, which is the ultimate aim for every UAV. This section describes common payload sensors and introduces two example cases in which onboard payloads were used to solve real-world problems. A collision avoidance payload based on electro-optical (EO) sensors is first introduced, followed by a remote sensing application for power line inspection and vegetation management.