97 results for Approach to CSR development
Abstract:
Pervasive computing applications must be engineered to provide unprecedented levels of flexibility so that they can reconfigure and adapt in response to changes in computing resources and user requirements. To meet these challenges, appropriate software engineering abstractions and infrastructure are required as a platform on which to build adaptive applications. In this paper, we demonstrate the use of a disciplined, model-based approach to engineer a context-aware Session Initiation Protocol (SIP) based communication application. This disciplined approach builds on our previously developed conceptual models and infrastructural components, which support the description, acquisition, management and exploitation of arbitrary types of context and user preference information, enabling adaptation to context changes.
Abstract:
A major challenge in successfully implementing transit-oriented development (TOD) is having a robust process that ensures effective appraisal, initiation and delivery of multi-stakeholder TOD projects. A step-by-step project development process can assist in the methodical design, evaluation, and initiation of TOD projects. Successful TOD requires attention to transit, mixed-use development and public space. Brisbane, Australia provides a case study in which recent planning policies and infrastructure documents have laid a foundation for TOD, but where barriers lie in precinct-level planning and project implementation. In this context, and perhaps in others, the research effort needs to shift toward identification of appropriate project processes and strategies. This paper presents the outcomes of research conducted to date. Drawing on the mainstream approach to project development and financial evaluation for property projects, key steps for potential use in successful delivery of TOD projects have been identified, including: establish the framework; location selection; precinct context review; preliminary precinct design; the initial financial viability study; the decision stage; establishment of project structure; land acquisition; development application; and project delivery. The appropriateness of this mainstream development and appraisal process will be tested through stakeholder research, and the proposed process will then be refined for adoption in TOD projects. It is suggested that the criteria for successful TOD should be broadened beyond financial concerns in order to deliver public sector support for project initiation.
Abstract:
Background: Tissue Doppler may be used to quantify regional left ventricular function but is limited by segmental variation of longitudinal velocity from base to apex and free to septal walls. We sought to overcome this by developing a composite of longitudinal and radial velocities. Methods and Results: We examined 82 unselected patients undergoing a standard dobutamine echocardiogram. Longitudinal velocity was obtained in the basal and mid segments of each wall using tissue Doppler in the apical views. Radial velocities were derived in the same segments using an automated border detection system and centerline method, with regional chords grouped according to segment location and temporally averaged. In 25 patients at low probability of coronary disease, the pattern of regional variation in longitudinal velocity (higher in the septum) was the opposite of radial velocity (higher in the free wall), and the combination was homogeneous. In 57 patients undergoing angiography, velocity in abnormal segments was lower than in normal segments using longitudinal (6.0 +/- 3.6 vs 9.0 +/- 2.2 cm/s, P = .01) and radial velocity (6.0 +/- 4.0 vs 8.0 +/- 3.9 cm/s, P = .02). However, the composite velocity permitted better separation of abnormal and normal segments (13.3 +/- 5.6 vs 17.5 +/- 4.2 cm/s, P = .001). There was no significant difference between the accuracy of this quantitative approach and expert visual wall motion analysis (81% vs 84%, P = .56). Conclusion: Regional variation of uni-dimensional myocardial velocities necessitates site-specific normal ranges, probably because of different fiber directions. Combined analysis of longitudinal and radial velocities allows the derivation of a composite velocity, which is homogeneous in all segments and may allow better separation of normal and abnormal myocardium.
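The abstract does not state the rule used to combine the two velocities, but the reported group means (composite ~17.5 cm/s in normal segments against longitudinal ~9.0 and radial ~8.0 cm/s) are consistent with a simple per-segment sum. A minimal sketch under that assumption:

```python
def composite_velocity(longitudinal_cms, radial_cms):
    """Combine longitudinal and radial peak velocities for one segment.

    Assumption: a simple sum, chosen because the reported composite group
    means are close to the sum of the longitudinal and radial means; the
    paper's actual combination rule is not given in the abstract.
    """
    return longitudinal_cms + radial_cms

# Reported group means (cm/s) for normal and abnormal segments:
normal = composite_velocity(9.0, 8.0)    # near the reported 17.5 +/- 4.2
abnormal = composite_velocity(6.0, 6.0)  # near the reported 13.3 +/- 5.6
print(normal, abnormal)
```

The point of the composite is that the base-to-apex and septal-to-free-wall gradients of the two unidimensional velocities run in opposite directions, so their combination needs no site-specific normal range.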
Abstract:
We have piloted a monthly series of multidisciplinary case discussions via videoconference in the area of child development. The project provided a forum for clinical discussion of complex cases, peer review, professional development and networking for allied health professionals and paediatricians. Six sites in Queensland participated in the project; each site presented at least one case for discussion. The videoconferences ran for 90 min each and were attended by an average of 26 health professionals. The response rate for a questionnaire survey was 71%. The respondents rated the effectiveness of case summaries and the follow-up newsletter very positively. Despite some early difficulties with the technical aspects of videoconferencing, the evaluation demonstrated the participants' satisfaction with the project and its relevance to their everyday practice.
Abstract:
The development of large-scale solid-state fermentation (SSF) processes is hampered by the lack of simple tools for the design of SSF bioreactors. The use of semifundamental mathematical models to design and operate SSF bioreactors can be complex. In this work, dimensionless design factors are used to predict the effects of scale and of operational variables on the performance of rotating drum bioreactors. The dimensionless design factor (DDF) is a ratio of the rate of heat generation to the rate of heat removal at the time of peak heat production. It can be used to predict maximum temperatures reached within the substrate bed for given operational variables. Alternatively, given the maximum temperature that can be tolerated during the fermentation, it can be used to explore the combinations of operating variables that prevent that temperature from being exceeded. Comparison of the predictions of the DDF approach with literature data for operation of rotating drums suggests that the DDF is a useful tool. The DDF approach was used to explore the consequences of three scale-up strategies on the required air flow rates and maximum temperatures achieved in the substrate bed as the bioreactor size was increased on the basis of geometric similarity. The first of these strategies was to maintain the superficial flow rate of the process air through the drum constant. The second was to maintain the ratio of volumes of air per volume of bioreactor constant. The third strategy was to adjust the air flow rate with increase in scale in such a manner as to maintain constant the maximum temperature attained in the substrate bed during the fermentation.
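The DDF itself is a simple ratio, and under geometric similarity the first two scale-up strategies imply different growth laws for the air flow rate: constant superficial velocity scales flow with the drum cross-section, while a constant air-volume-per-bioreactor-volume ratio scales flow with the drum volume. A sketch of both ideas, with all variable names and the base flow value chosen for illustration:

```python
def ddf(q_generation_peak_w, q_removal_w):
    """Dimensionless design factor: rate of heat generation over rate of
    heat removal at the time of peak heat production (as defined in the
    abstract). DDF > 1 indicates heat accumulates in the substrate bed."""
    return q_generation_peak_w / q_removal_w

def scaled_air_flow(base_flow_m3_per_h, linear_scale):
    """Air flow rate after scaling a rotating drum geometrically by the
    linear factor `linear_scale`, under the first two strategies:
    - constant superficial velocity: flow ~ cross-sectional area ~ scale**2
    - constant air volume per bioreactor volume: flow ~ volume ~ scale**3
    """
    return {
        "constant_superficial_velocity": base_flow_m3_per_h * linear_scale**2,
        "constant_air_per_volume": base_flow_m3_per_h * linear_scale**3,
    }

print(ddf(200.0, 100.0))          # bed overheats: generation exceeds removal
print(scaled_air_flow(10.0, 2.0))  # doubling all linear dimensions
```

Since heat generation grows roughly with substrate volume (scale cubed), the constant-superficial-velocity strategy lets the DDF rise with scale, which is why the third strategy adjusts air flow to hold the peak bed temperature constant instead.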
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development (e.g. formal) methods. Induction is the key to deriving and verifying functional programs, but can be simplified through packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, although experience within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
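The idea of representing a data structure directly by its characteristic function can be sketched in a Church-encoding style: a list is identified with its own right fold, so "interpreting" the data is just applying it. This is an illustrative sketch of the general technique, not code from the paper:

```python
# A list represented as its own fold: the structure *is* the function
# that consumes it, so no separate interpreter is needed.

def nil(cons_fn, nil_val):
    # The empty list, folded, is just the base value.
    return nil_val

def cons(head, tail):
    # A non-empty list, folded, applies cons_fn to the head and the
    # folded tail - i.e. cons(h, t) is a partially-applied right fold.
    def folded(cons_fn, nil_val):
        return cons_fn(head, tail(cons_fn, nil_val))
    return folded

xs = cons(1, cons(2, cons(3, nil)))          # the list [1, 2, 3] as a combinator

total = xs(lambda h, acc: h + acc, 0)        # summing is one application
as_list = xs(lambda h, acc: [h] + acc, [])   # converting back to a native list
print(total, as_list)
```

Because `cons(h, t)` is literally a partially-applied fold, any property proved once about folds (fusion, universality) transfers to every such combinator, which is the link between the two simplifications the abstract describes.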
Abstract:
Teen Triple P is a multilevel system of intervention that is designed to provide parents with specific strategies to promote the positive development of their teenage children as they make the transition into high school and through puberty. The program is based on a combination of education about the developmental needs of adolescents, skills training to improve communication and problem-solving, plus specific modules to deal with common problems encountered by parents and adolescents that can escalate into major conflict and violence. It is designed to increase the engagement of parents of adolescent and pre-adolescent children by providing them with easy access to evidence-based parenting advice and support. This paper presents data collected as part of a survey of over 1400 students in their first year of high school at nine Brisbane schools. The survey instrument was constructed to obtain students' reports about behaviour which is known to be associated with their health and wellbeing, and also on the extent to which their parents promoted or discouraged such behaviour at home, at school, and in their social and recreational activities in the wider community. Selected data from the survey were extracted and presented to parents at a series of parenting seminars held at the schools to promote appropriate parenting of teenagers. The objectives were to provide parents with accurate data about teenagers' behaviour, and about teenagers' reports of how they perceived their parents' behaviour. Normative data on parent and teenager behaviour will be presented from the survey, as well as psychometric data relating to the reliability and validity of this new measure. Implications of this strategy for increasing parent engagement in parenting programs that aim to reduce behavioural and emotional problems in adolescents will be discussed.
Abstract:
Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, there are many people who need to code models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, the comparison of simulation data with that of commercial models leads only to the detection, not the isolation, of errors. Identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: firstly, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Secondly, an observer is designed to generate residuals, such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM 1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to simultaneous verification of any two independent implementations, hence is useful in commercial model development.
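The final localisation step amounts to asking which feature-matrix subspace the observer-generated residual vector (nearly) lies in. A minimal sketch of that geometric test, with the error-class names, feature matrices, and residual all hypothetical and the observer design itself omitted:

```python
import numpy as np

def isolate_error_class(residual, feature_matrices):
    """Attribute a residual vector to the error class whose feature-matrix
    subspace it lies closest to, by orthogonal projection. Illustrative
    sketch of the localisation step only; the paper's observer that
    generates the residuals is not reproduced here."""
    best_class, best_dist = None, np.inf
    for name, F in feature_matrices.items():
        # Least-squares coefficients give the projection onto span(F).
        coeffs, *_ = np.linalg.lstsq(F, residual, rcond=None)
        dist = np.linalg.norm(residual - F @ coeffs)
        if dist < best_dist:
            best_class, best_dist = name, dist
    return best_class

# Two hypothetical error classes with one-dimensional feature subspaces:
features = {
    "class_A": np.array([[1.0], [0.0], [0.0]]),
    "class_B": np.array([[0.0], [1.0], [1.0]]),
}
r = np.array([0.1, 2.0, 2.0])  # residual from the observer (hypothetical)
print(isolate_error_class(r, features))  # lies almost entirely in class_B
```

Simultaneous errors can be handled the same way by testing the residual against sums of subspaces (stacked feature matrices) rather than single classes.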