985 results for "Scientists"


Relevance: 10.00%

Abstract:

The main aim of radiotherapy is to deliver a dose of radiation high enough to destroy the tumour cells while minimising the damage to normal healthy tissues. Clinically, this has been achieved by assigning a prescription dose to the tumour volume and a set of dose constraints on critical structures. Once an optimal treatment plan has been achieved, the dosimetry is assessed using the physical parameters of dose and volume. There has been interest in using radiobiological parameters to evaluate and predict the outcome of a treatment plan in terms of both a tumour control probability (TCP) and a normal tissue complication probability (NTCP). In this study, simple radiobiological models available in a commercial treatment planning system were used to compare three-dimensional conformal radiotherapy (3D-CRT) and intensity-modulated radiotherapy (IMRT) treatments of the prostate. Initially, both 3D-CRT and IMRT were planned at 2 Gy/fraction to a total dose of 60 Gy to the prostate. The sensitivity of the TCP and the NTCP to both conventional dose escalation and hypo-fractionation was investigated. The biological responses were calculated using the Källman S-model, and the complication-free tumour control probability (P+) was generated from the combined NTCP and TCP response values. It has been suggested that the alpha/beta ratio for prostate carcinoma cells may be lower than for most other tumour cell types; the effect of this on the modelled biological response for the different fractionation schedules was also investigated.
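Under the common assumption of independent tumour and normal-tissue responses, the complication-free tumour control probability can be written P+ = TCP × (1 − NTCP). A minimal sketch of how fractionation sensitivity might be explored, using the standard Poisson/linear-quadratic TCP model rather than the planning system's actual Källman implementation (the clonogen number, alpha, beta and NTCP values below are illustrative, not the study's):

```python
import math

def surviving_fraction(dose_per_fraction, n_fractions, alpha, beta):
    """Linear-quadratic cell survival after a fractionated schedule."""
    d, n = dose_per_fraction, n_fractions
    return math.exp(-n * (alpha * d + beta * d * d))

def tcp(n_clonogens, dose_per_fraction, n_fractions, alpha, beta):
    """Poisson TCP: probability that no clonogen survives the course."""
    sf = surviving_fraction(dose_per_fraction, n_fractions, alpha, beta)
    return math.exp(-n_clonogens * sf)

def p_plus(tcp_value, ntcp_value):
    """Complication-free tumour control, assuming independent responses."""
    return tcp_value * (1.0 - ntcp_value)

# Illustrative comparison: 30 x 2 Gy with a low vs a typical alpha/beta
for alpha, beta in [(0.15, 0.05), (0.3, 0.03)]:  # alpha/beta = 3 vs 10 Gy
    t = tcp(1e6, 2.0, 30, alpha, beta)
    print(round(alpha / beta, 1), round(p_plus(t, 0.05), 3))
```

Lowering the alpha/beta ratio while holding the schedule fixed changes the computed TCP markedly, which is why hypo-fractionation becomes attractive if the prostate alpha/beta ratio is genuinely low.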

Relevance: 10.00%

Abstract:

Background: The problem of silent multiple comparisons is one of the most difficult statistical problems faced by scientists. It is a particular problem when investigating a one-off cancer cluster reported to a health department, because any one of hundreds, or possibly thousands, of neighbourhoods, schools, or workplaces could have reported a cluster, which could have been for any one of several types of cancer or any one of several time periods.

Methods: This paper contrasts the frequentist approach with a Bayesian approach for dealing with silent multiple comparisons in the context of a one-off cluster reported to a health department. Two published cluster investigations were re-analysed using the Dunn-Sidak method to adjust frequentist p-values and confidence intervals for silent multiple comparisons. Bayesian methods were based on the Gamma distribution.

Results: Bayesian analysis with non-informative priors produced results similar to the frequentist analysis, and suggested that both clusters represented a statistical excess. In the frequentist framework, the statistical significance of both clusters was extremely sensitive to the number of silent multiple comparisons, which can only ever be a subjective "guesstimate". The Bayesian approach is also subjective: whether there is an apparent statistical excess depends on the specified prior.

Conclusion: In cluster investigations, the frequentist approach is just as subjective as the Bayesian approach, but the Bayesian approach is less ambitious in that it treats the analysis as a synthesis of data and personal judgements (possibly poor ones), rather than objective reality. Bayesian analysis is (arguably) a useful tool to support complicated decision-making, because it makes the uncertainty associated with silent multiple comparisons explicit.
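Both approaches contrasted above have simple closed forms: for k silent comparisons the Dunn-Sidak adjusted p-value is 1 − (1 − p)^k, and with a Gamma(a, b) prior on the incidence rate, an observed count y against an expected count E gives a Gamma(a + y, b + E) posterior. A minimal sketch (the p-value, counts, prior and values of k are invented for illustration, not taken from the two re-analysed investigations):

```python
def sidak_adjust(p, k):
    """Dunn-Sidak adjustment of a p-value for k (silent) comparisons."""
    return 1.0 - (1.0 - p) ** k

def gamma_posterior(a, b, observed, expected):
    """Gamma-Poisson conjugate update for a cluster's rate ratio."""
    return a + observed, b + expected

# The same unadjusted p = 0.001 looks very different depending on how
# many neighbourhoods/cancers/periods could have triggered the report.
for k in (1, 100, 1000):
    print(k, round(sidak_adjust(0.001, k), 3))

# Posterior for 8 observed cases against 2.0 expected, under a vague prior
print(gamma_posterior(0.5, 0.001, 8, 2.0))
```

The paper's point is visible immediately: the frequentist answer hinges on the guesstimated k, while the Bayesian answer hinges on the specified prior (a, b).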

Relevance: 10.00%

Abstract:

Searching academic databases for records on ‘business failure’, ‘business distress’ or ‘bankruptcy’ yields a large body of studies on qualitative, empirical, theoretical and simulation aspects. A central part of this research is to distil from this large quantity of potentially relevant reports and methodologies those which can both flag and predict business failure in the construction industry. An additional search term, such as ‘construction’, ‘construction industry’ or ‘contractor’, yields a much smaller number of hits, many of which emphasize the construction industry’s distinctive characteristics. We scientists need first to understand the subject of investigation and the environment in which it lives. To do so, an analysis of existing successful and failed approaches to particular research questions is helpful before embarking on new territory. This guides the structure of the following report: we first review papers that specifically report on aspects of business failure in the construction industry, followed by (a) an overview of promising candidates borrowed from other disciplines and industries, and (b) a possible novel approach. An Australian (Queensland) perspective on the topic also drives this investigation, as most of the published research has been applied to the US and UK construction industries.

Relevance: 10.00%

Abstract:

This paper draws on a study of gender and politics in the Australian parliament in order to make a contribution to methodological debates in feminist political science. The paper begins by outlining the different dimensions of feminist political science methodology that have been identified in the literature. According to this literature, five key principles can be seen to constitute feminist approaches to political science: a focus on gender, a deconstruction of the public/private divide, giving voice to women, using research as a basis for transformation, and using reflexivity to critique researcher positionality. The next part of the paper focuses more specifically on reflexivity, tracing arguments about its definition and usefulness and the criticisms it has attracted from researchers. Following this, I explore how my background as a member of the Australian House of Representatives from 1987 to 1996 provided an important academic resource in my doctoral study of gender and politics in the national parliament. Through this process I highlight the value of a reflexive approach to research.

Relevance: 10.00%

Abstract:

This article explores the concept of a scientific community at the macro-national level in the context of Iran, where the institutionalisation of science and its professional growth have been constrained by several factors. The article first conceptualises the notion of a science community as found in the literature in the context of Iran, and attempts to map it through some indicators. The main focus, however, lies in mapping some institutional problems through empirical research, undertaken in 2002–04 to analyse the structure of the scientific community in Iran in the ‘exact sciences’ (mathematics, physics, chemistry, biology and earth sciences). The empirical work was done from two complementary perspectives: through a questionnaire and statistical analysis of it, and through semi-structured interviews with the researchers. There are a number of problems confronting scientists in Iran. The inadequacy of facilities provided by institutions is one of the major problems of research. Another is the tenuous cooperation among scientists: most of the researchers deplore the lack of cooperation within their group, and relationships are mostly with Ph.D. students and only marginally with colleagues. Our research shows that the more brilliant the scientists, the more frustrated they are with scientific institutions in Iran. Medium-range researchers seem to be much happier with the scientific institution to which they belong than the brighter scholars; the scientific institutions in Iran seem to be built for the needs of the former rather than the latter, and not to play a positive role in the case of the best scientists. On the whole, many ingredients of a scientific community, at least at its inception, are present among Iranian scientists: the strong desire for scientific achievement in spite of personal, institutional and economic problems.

Relevance: 10.00%

Abstract:

In the study of complex neurobiological movement systems, measurement indeterminacy has typically been overcome by imposing artificial modelling constraints to reduce the number of unknowns (e.g., reducing all muscle, bone and ligament forces crossing a joint to a single vector). However, this approach prevents human movement scientists from investigating more fully the role, functionality and ubiquity of coordinative structures or functional motor synergies. Advancements in measurement methods and analysis techniques are required if the contribution of individual component parts or degrees of freedom of these task-specific structural units is to be established, thereby effectively solving the indeterminacy problem by reducing the number of unknowns. A further benefit of establishing more of the unknowns is that human movement scientists will be able to gain greater insight into ubiquitous processes of physical self-organising that underpin the formation of coordinative structures and the confluence of organismic, environmental and task constraints that determine the exact morphology of these special-purpose devices.

Relevance: 10.00%

Abstract:

These Proceedings, arising from the 2008 World Dance Alliance Global Summit, reflect both its spirit and diversity, re-appraising what dance is and might be in the 21st century. Through 53 papers from 14 countries in the Americas, Europe and the Asia-Pacific region, the authors — ranging from seasoned scholars to emerging artists publishing for the first time — span the perspectives of academics, educators, performance and community artists, health professionals and cognitive scientists, predominantly from dance but also from film, visual arts, science, performance and philosophy. The papers are grouped under the five Summit themes: Re-thinking the way we make Dance; Re-thinking the way we teach Dance; Mind/body connections; Transcultural conversations; and Sustainability.

Relevance: 10.00%

Abstract:

It can probably be said that people who work in the world of the natural sciences — chemistry, biology, mathematics, and so on — regard themselves as inhabiting a fairly straightforward epistemological world. They have a body of knowledge to draw upon from within any given area, much like a set of bricks in a wall, and when they conduct their experiments, they add another brick to that wall … and the wall gets taller, and they know more than they did before. Such scientists would contend that the questions they ask as part of their research flow naturally from the very nature of the world itself. But even then, they could also argue that the specifics of the question ultimately don't really matter that much, because if enough people work on any given brick, the truth will eventually emerge anyway.

Relevance: 10.00%

Abstract:

The need for large-scale environmental monitoring to manage environmental change is well established. Ecologists have long used acoustics as a means of monitoring the environment in their field work, and so the value of an acoustic environmental observatory is evident. However, the volume of data generated by such an observatory would quickly overwhelm even the most fervent scientist using traditional methods. In this paper we present our steps towards realising a complete acoustic environmental observatory, i.e. a cohesive set of hardware sensors, management utilities, and analytical tools required for large-scale environmental monitoring. Concrete examples of these elements, which are in active use by ecological scientists, are also presented.

Relevance: 10.00%

Abstract:

Science has been under attack in the last thirty years, and recently a number of prominent scientists have been busy fighting back. Here, an argument is presented that the ‘science wars’ stem from an unreasonably strict adherence to the reductive method on the part of science, but that weakening this stance need not imply a lapse into subjectivity. One possible method for formalising the description of non-separable, contextually dependent complex systems is presented, based upon a quantum-like approach.

Relevance: 10.00%

Abstract:

Cultural objects are increasingly generated and stored in digital form, yet effective methods for their indexing and retrieval still remain an important area of research. The main problem arises from the disconnection between the content-based indexing approach used by computer scientists and the description-based approach used by information scientists. There is also a lack of representational schemes that allow the alignment of semantics and context with keywords and low-level features that can be automatically extracted from the content of these cultural objects. This paper presents an integrated approach to address these problems, taking advantage of both computer science and information science approaches. We first discuss the requirements from a number of perspectives: users, content providers, content managers and technical systems. We then present an overview of our system architecture and describe the techniques which underlie the major components of the system, including automatic object category detection; user-driven tagging; metadata transform and augmentation; and an expression language for digital cultural objects. In addition, we discuss our experience of testing and evaluating some existing collections, analyse the difficulties encountered and propose ways to address them.
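The integration argued for above can be pictured as a single retrieval record that carries both description streams plus the user-tagging layer. A hypothetical sketch in Python (the field names and structure are ours for illustration, not the paper's actual schema or expression language):

```python
from dataclasses import dataclass, field

@dataclass
class CulturalObjectRecord:
    """One retrieval record combining both indexing traditions."""
    object_id: str
    # Description-based metadata, curated by information scientists
    title: str
    subject_terms: list = field(default_factory=list)
    # Content-based features, extracted automatically by the system
    detected_categories: list = field(default_factory=list)
    low_level_features: dict = field(default_factory=dict)
    # User-driven tagging layer
    user_tags: list = field(default_factory=list)

    def searchable_terms(self):
        """Merge curated, automatic and user vocabularies for retrieval."""
        return set(self.subject_terms) | set(self.detected_categories) | set(self.user_tags)
```

Retrieval can then match a query against the merged vocabulary while ranking on the low-level features, rather than forcing a choice between the two indexing traditions.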

Relevance: 10.00%

Abstract:

This thesis details a methodology to estimate urban stormwater quality from a set of easy-to-measure physico-chemical parameters, which can be used as surrogates for other key water quality parameters. The key pollutants considered in this study are nitrogen compounds, phosphorus compounds and solids. The use of surrogate parameter relationships to evaluate urban stormwater quality will reduce the cost of monitoring, giving scientists added capability to generate large amounts of data for more rigorous analysis of key urban stormwater quality processes, namely pollutant build-up and wash-off. This in turn will assist in the development of more stringent stormwater quality mitigation strategies. The research methodology was based on a series of field investigations, laboratory testing and data analysis. Field investigations were conducted to collect pollutant build-up and wash-off samples from residential roads and roof surfaces, since past research has identified these impervious surfaces as the primary pollutant sources in urban stormwater runoff. A specially designed vacuum system and rainfall simulator were used in the collection of pollutant build-up and wash-off samples. The collected samples were tested for a range of physico-chemical parameters. Data analysis was conducted using both univariate and multivariate techniques. Analysis of build-up samples showed that pollutant loads accumulated on road surfaces are higher than the pollutant loads on roof surfaces. Furthermore, it was found that the fraction of solids smaller than 150 µm is the most polluted particle size fraction in solids build-up on both road and roof surfaces. The analysis of wash-off data confirmed that the simulated wash-off process adopted for this research agrees well with the general understanding of the wash-off process on urban impervious surfaces.
The observed pollutant concentrations in wash-off from road surfaces differed from those in wash-off from roof surfaces. Therefore, the identification of surrogate parameters was first undertaken separately for road and roof surfaces; a common set of surrogate parameter relationships was then identified for both surfaces together. Surrogate parameters were identified for nitrogen, phosphorus and solids separately. Electrical conductivity (EC), total organic carbon (TOC), dissolved organic carbon (DOC), total suspended solids (TSS), total dissolved solids (TDS), total solids (TS) and turbidity (TTU) were selected as the relatively easy-to-measure parameters. Consequently, surrogate parameters for nitrogen and phosphorus were identified from this set for both road and roof surfaces. Additionally, surrogate parameters for TSS, TDS and TS, which are key indicators of solids, were obtained from EC and TTU, both of which can be measured directly in the field. The regression relationships developed between the surrogate parameters and each key parameter of interest took a similar form for road and roof surfaces, namely simple linear regression equations. The relationships identified for road surfaces were DTN-TDS:DOC, TP-TS:TOC, TSS-TTU, TDS-EC and TS-TTU:EC; those identified for roof surfaces were DTN-TDS and TS-TTU:EC. Some of the relationships developed had a higher confidence interval whilst others had a relatively low confidence interval. The relationships obtained for DTN-TDS, DTN-DOC, TP-TS and TS-EC for road surfaces demonstrated good near-site portability potential. Currently, best management practices focus on providing treatment measures for stormwater runoff at catchment outlets, where road and roof runoff are not separated.
In this context, it is important to find a common set of surrogate parameter relationships for road and roof surfaces to evaluate urban stormwater quality. Consequently, the DTN-TDS, TS-EC and TS-TTU relationships were identified as the common relationships capable of providing measurements of DTN and TS irrespective of surface type.
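Each surrogate relationship above is a simple linear regression, e.g. estimating TDS from a direct field measurement of EC. A minimal sketch with invented numbers (the paired readings and fitted coefficients are illustrative only, not the thesis's data):

```python
import numpy as np

# Hypothetical paired field measurements: EC (uS/cm) and TDS (mg/L)
ec  = np.array([50.0, 120.0, 200.0, 310.0, 450.0])
tds = np.array([32.0,  80.0, 130.0, 205.0, 300.0])

# Fit the surrogate relationship TDS = a*EC + b by least squares
a, b = np.polyfit(ec, tds, 1)

def predict_tds(ec_reading):
    """Estimate TDS from an easy-to-measure EC field reading."""
    return a * ec_reading + b

print(round(a, 3), round(b, 3), round(predict_tds(250.0), 1))
```

Near-site portability then amounts to checking how well the coefficients (a, b) fitted at one site predict measurements taken at a neighbouring one.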

Relevance: 10.00%

Abstract:

Weber's contribution on the Protestant work ethic has stimulated numerous social scientists. However, the question of whether a specifically Protestant work ethic exists at all is still rarely analysed. Our results indicate that work ethic is influenced by denomination-based religiosity and education.

Relevance: 10.00%

Abstract:

Agriculture accounts for a significant portion of GDP in most developed countries. However, managing farms, particularly large-scale extensive farming systems, is hindered by a lack of data and an increasing shortage of labour. We have deployed a large heterogeneous sensor network on a working farm to explore sensor network applications that can address some of these issues. Our network is solar powered and has been running for over six months. The current deployment consists of over 40 moisture sensors that provide soil moisture profiles at varying depths, weight sensors to compute the amount of food and water consumed by animals, electronic tag readers, up to 40 sensors that can be used to track animal movement (consisting of GPS, compass and accelerometers), and 20 sensor/actuators that can be used to apply different stimuli (audio, vibration and mild electric shock) to the animals. The static part of the network is designed for 24/7 operation and is linked to the Internet via a dedicated high-gain radio link, also solar powered. The initial goals of the deployment are to provide a testbed for sensor network research in programmability and data handling while also being a vital tool for scientists to study animal behaviour. Our longer-term aim is to create a management system that completely transforms the way farms are managed.