962 results for "complex issues"


Relevance: 30.00%

Abstract:

Childhood obesity is becoming a topical issue in both the health literature and the popular media, and child health nurses are increasingly observing preschool children who appear disproportionately heavy for their height when plotted on standardised growth charts. In this paper, literature related to childhood obesity in New Zealand and internationally is explored to identify current issues, and the implications of these issues for nurses in community-based child health practice are discussed. Themes that emerged from the literature relate to the measurement of obesity, links between childhood and adult obesity, and issues for families. A theme in the literature around maternal perception was of particular interest. Studies that investigated maternal perceptions of childhood obesity found that mothers identified their child as being overweight or obese only when the weight imposed limitations on physical activity or when the children were teased, rather than by referring to individual growth charts. The implications for nursing in the area of child health practice are discussed: nurses working in this area need an understanding of the complex and often emotive issues surrounding childhood obesity and an awareness of the reality of people's lives when devising health promotion strategies.

Relevance: 30.00%

Abstract:

We demonstrate that the process of generating smooth transitions can be viewed as a natural result of the filtering operations implied in the generation of discrete-time series observations from the sampling of data from an underlying continuous-time process that has undergone structural change. In order to focus discussion, we utilize the problem of estimating the location of abrupt shifts in some simple time series models. This approach permits us to address salient issues relating to distortions induced by the inherent aggregation associated with discrete-time sampling of continuous-time processes experiencing structural change. We also address the issue of how time-irreversible structures may be generated within the smooth transition processes. (c) 2005 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

A theory of value sits at the core of every school of economic thought and directs the allocation of resources to competing uses. Ecological resources complicate the modern neoclassical approach to determining value due to their complex nature, considerable non-market values and the difficulty in assigning property rights. Application of the market model through economic valuation only provides analytical solutions based on virtual markets, and neither the demand- nor supply-side techniques of valuation can adequately consider the complex set of biophysical and ecological relations that lead to the provision of ecosystem goods and services. This paper sets out a conceptual framework for a complex systems approach to the value of ecological resources. This approach is based on there being both an intrinsic quality of ecological resources and a subjective evaluation by the consumer; both elements are necessary for economic value. This conceptual framework points the way towards a theory of value that incorporates both elements, and so has implications for the principles by which ecological resources can be allocated. (c) 2005 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Large amounts of information can be overwhelming and costly to process, especially when transmitting data over a network. A typical modern Geographical Information System (GIS) brings all types of data together based on the geographic component of the data and provides simple point-and-click query capabilities as well as complex analysis tools. Querying a Geographical Information System, however, can be prohibitively expensive due to the large amounts of data which may need to be processed. Since the use of GIS technology has grown dramatically in the past few years, there is now, more than ever, a need to provide users with the fastest and least expensive query capabilities, especially since an estimated 80% of data stored in corporate databases has a geographical component. However, not every application requires the same high-quality data for its processing. In this paper we address the issues of reducing the cost and response time of GIS queries by pre-aggregating data, compromising on data accuracy and precision. We present computational issues in the generation of multi-level resolutions of spatial data and show that the problem of finding the best approximation for a given region and a real-valued function on this region, under a predictable error, is in general NP-complete.
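The multi-level pre-aggregation described above can be illustrated with a minimal sketch (the grid data and the 2x2 averaging scheme are illustrative assumptions, not the paper's actual method): each coarser level summarises blocks of the level below, so queries can be answered approximately at a fraction of the cost.

```python
import numpy as np

def build_pyramid(grid, levels):
    """Pre-aggregate a 2-D spatial grid into coarser resolutions.

    Each level averages non-overlapping 2x2 blocks of the previous
    level, trading accuracy and precision for cheaper queries.
    """
    pyramid = [grid]
    for _ in range(levels):
        g = pyramid[-1]
        h, w = g.shape[0] // 2 * 2, g.shape[1] // 2 * 2  # trim odd edges
        blocks = g[:h, :w].reshape(h // 2, 2, w // 2, 2)
        pyramid.append(blocks.mean(axis=(1, 3)))
    return pyramid

# A 4x4 grid of attribute values (e.g. population counts per cell).
grid = np.arange(16, dtype=float).reshape(4, 4)
pyramid = build_pyramid(grid, levels=2)
print(pyramid[1].shape)  # (2, 2): a quarter of the cells, lower precision
print(pyramid[2].shape)  # (1, 1): a single summary value
```

A query that tolerates a predictable error can then be answered from a coarse level; only exact queries need to touch the full-resolution data.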

Relevance: 30.00%

Abstract:

Purpose: The business process outsourcing (BPO) industry in India is evolving rapidly, and one of its key characteristics is the emergence of high-end services offered by knowledge process outsourcing (KPO) organizations. These organizations are set to grow at a tremendous pace. Given the people-intensive nature of this industry, efficient employee management is bound to play a critical role. The literature lacks studies offering insights into the HR challenges involved and the ways in which they are addressed by KPOs. The purpose of this paper is to fill this gap by presenting findings from an in-depth case study of a KPO organization. Design/methodology/approach: To achieve the research objective we adopted an in-depth case study approach. The research setting was a KPO organization in India, which specialises in offering complex analytics, accounting and support services to the real estate and financial services industries. Findings: The results of this study highlight the differences in the nature of work characteristics in such organizations compared to call centres. The study also highlights some of the key people-management challenges that these organizations face, such as attracting and retaining talent. The case company adopts formal, structured, transparent and innovative human resource practices. The study also highlights that such enlightened human resource practices stand on the foundations laid by an open work environment and facilitative leadership. Research limitations/implications: One of the key limitations is that the analysis is based on primary data from a single case study and only 18 interviews. The analysis contributes to the fields of KPO, HRM and India and has key messages for policy makers. Originality/value: The literature on outsourcing has in general focused on call centres established in the developed world. However, the booming BPO industry in India is also beginning to offer high-end services that go far beyond those of typical call centres. These KPOs and their people-management challenges are relatively unexplored territory in the literature. By conducting this study in an emerging market (India) and focusing on people-related challenges in KPOs, this study provides a fresh perspective on the extant BPO literature. © Emerald Group Publishing Limited.

Relevance: 30.00%

Abstract:

This thesis presents an investigation of synchronisation and causality, motivated by problems in computational neuroscience. The thesis addresses both theoretical and practical signal processing issues regarding the estimation of interdependence from a set of multivariate data generated by a complex underlying dynamical system. This topic is driven by a series of problems in neuroscience, which represent the principal background motive behind the material in this work. The underlying system is the human brain and the generative process of the data is based on modern electromagnetic neuroimaging methods. In this thesis, the brain's underlying functional mechanisms are described using the recent mathematical formalism of dynamical systems on complex networks. This is justified principally on the grounds of the complex hierarchical and multiscale nature of the brain, and it offers new methods of analysis with which to model its emergent phenomena. A fundamental approach to studying neural activity is to investigate the connectivity pattern developed by the brain's complex network. Three types of connectivity are important to study: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerned with the way the neural elements communicate with each other using the brain's anatomical structure, through phenomena of synchronisation and information transfer; and 3) functional connectivity, an epistemic concept which refers to the interdependence between data measured from the brain network. The main contribution of this thesis is to present, apply and discuss novel algorithms for functional connectivity, which are designed to extract different specific aspects of interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow for indirect assessment of synchronisation in the local network from a single time series. This approach is useful for inferring the coupling within a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and inference of coupling from observed data. These methods are designed to address the estimation of medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced in order to detect the direction of the coupling between different regions of the brain. The method developed in this thesis is based on a machine-learning extension of the well-known concept of Granger causality. The discussion is developed alongside examples of synthetic and experimental real data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies; they are helpful for testing the techniques developed in this thesis. The real datasets are provided to illustrate the problem of brain connectivity in the case of important neurological disorders such as epilepsy and Parkinson's disease. The methods of functional connectivity in this thesis are applied to intracranial EEG recordings in order to extract features which characterize the underlying spatiotemporal dynamics before, during and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to an MEG dataset containing healthy, Parkinson's and dementia subjects, with the aim of distinguishing pathological from physiological patterns of connectivity.
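The classical linear form of Granger causality underlying the thesis's machine-learning extension can be sketched as follows (a baseline illustration only; the simulated signals and lag order are assumptions): x is said to Granger-cause y when past values of x improve the prediction of y beyond what y's own past achieves.

```python
import numpy as np

def granger_gain(x, y, lag=2):
    """Ratio of residual variances: autoregressive model of y alone
    versus y's past plus x's past (classical linear Granger causality).
    A ratio well above 1 suggests x helps predict y.
    """
    n = len(y)
    Y = y[lag:]
    # Lagged design matrices: y's own past, then y's and x's past.
    own = np.column_stack([y[lag - k - 1:n - k - 1] for k in range(lag)])
    both = np.column_stack(
        [own] + [x[lag - k - 1:n - k - 1] for k in range(lag)])
    res_own = Y - own @ np.linalg.lstsq(own, Y, rcond=None)[0]
    res_both = Y - both @ np.linalg.lstsq(both, Y, rcond=None)[0]
    return res_own.var() / res_both.var()

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):          # y is driven by past x
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_gain(x, y) > granger_gain(y, x))  # True: coupling is x -> y
```

The asymmetry of the two ratios is what gives the metric its direction, which is the property the thesis exploits to detect the direction of coupling between brain regions.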

Relevance: 30.00%

Abstract:

Patients experience considerable difficulties in making and sustaining health-related lifestyle changes. Many Type 2 diabetes patients struggle to follow disease risk-management advice even when they receive extensive information and support. Drawing on a qualitative study of patients with Type 2 diabetes, the paper uses discourse analysis to examine their accounts of disease causation and disease management, and the implications for how they respond to their condition and to health services advice. As it is a multifactorial disease, biomedical discourse around Type 2 diabetes is complex. Patients are encouraged to grasp the complicated message that both the cause of their condition and its medical outcomes are partly, but not wholly, within their control. Discursive constructions identified from respondent accounts indicate how these two messages are deployed variously by respondents when accounting for disease causation and management. While these constructions (identified in respondent accounts as 'Up to me' and 'Down to them') are a valuable resource for patients, they may equally be deployed in a selective and detrimental way. We conclude that clear messages from health professionals about effective disease management may help patients to position themselves more effectively in relation to their condition. More importantly, such messages might close off inappropriate and potentially harmful patient positions in which patients either relinquish responsibility for disease management or reject all input from health professionals. © The Author 2005. Published by Oxford University Press. All rights reserved.

Relevance: 30.00%

Abstract:

A multi-chromosome GA (Multi-GA) was developed, based upon concepts from the natural world, allowing improved flexibility in a number of areas including representation, genetic operators and their parameter rates, and real-world multi-dimensional applications. A series of experiments was conducted comparing the performance of the Multi-GA to a traditional GA on a number of recognised and increasingly complex test optimisation surfaces, with promising results. Further experiments demonstrated the Multi-GA's flexibility through the use of non-binary chromosome representations and its applicability to dynamic parameterisation. A number of alternative and new methods of dynamic parameterisation were investigated, in addition to a new non-binary 'Quotient crossover' mechanism. Finally, the Multi-GA was applied to two real-world problems, demonstrating its ability to handle mixed-type chromosomes within an individual, the limited use of a chromosome-level fitness function, the introduction of new genetic operators for structural self-adaptation, and its viability as a serious real-world analysis tool. The first problem involved the optimum placement of computers within a building, allowing the Multi-GA to use multiple chromosomes with different type representations and different operators in a single individual. The second problem, commonly associated with Geographical Information Systems (GIS), required a spatial analysis to locate the optimum number and distribution of retail sites over two different population grids. In applying the Multi-GA, two new genetic operators (addition and deletion) were developed and explored, resulting in the definition of a mechanism for self-modification of genetic material within the Multi-GA structure and a study of this behaviour.
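The Multi-GA's core idea, an individual composed of several chromosomes with different representations, operators and parameter rates, can be sketched roughly as follows (the gene types and mutation rates here are illustrative, not the thesis's actual design):

```python
import random

random.seed(1)

class Chromosome:
    """One chromosome carrying its own representation, mutation
    operator and parameter rate, independent of its siblings."""
    def __init__(self, genes, mutate, rate):
        self.genes, self.mutate, self.rate = genes, mutate, rate

    def mutated(self):
        # Apply this chromosome's own operator at its own rate.
        return [self.mutate(g) if random.random() < self.rate else g
                for g in self.genes]

# A mixed-type individual: one binary chromosome, one real-valued one.
flip = lambda b: 1 - b                       # binary mutation
jitter = lambda v: v + random.gauss(0, 0.1)  # real-valued mutation

individual = [
    Chromosome([0, 1, 1, 0, 1], flip, rate=0.2),
    Chromosome([3.5, -1.2, 0.7], jitter, rate=0.5),
]

offspring = [c.mutated() for c in individual]
print(len(offspring[0]), len(offspring[1]))  # 5 3
```

Because each chromosome owns its operator and rate, representations and parameterisations can vary, and even self-adapt, within a single individual, which is the flexibility the thesis exploits.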

Relevance: 30.00%

Abstract:

The development of increasingly powerful computers, which has enabled the use of windowing software, has also opened the way for the computer study, via simulation, of very complex physical systems. In this study, the main issues related to the implementation of interactive simulations of complex systems are identified and discussed. Most existing simulators are closed in the sense that there is no access to the source code and, even if it were available, adaptation to interaction with other systems would require extensive code re-writing. This work aims to increase the flexibility of such software by developing a set of object-oriented simulation classes, which can be extended, by subclassing, at any level, i.e., at the problem domain, presentation or interaction levels. A strategy, which involves the use of an object-oriented framework, concurrent execution of several simulation modules, use of a networked windowing system and the re-use of existing software written in procedural languages, is proposed. A prototype tool which combines these techniques has been implemented and is presented. It allows the on-line definition of the configuration of the physical system and generates the appropriate graphical user interface. Simulation routines have been developed for the chemical recovery cycle of a paper pulp mill. The application, by creation of new classes, of the prototype to the interactive simulation of this physical system is described. Besides providing visual feedback, the resulting graphical user interface greatly simplifies the interaction with this set of simulation modules. This study shows that considerable benefits can be obtained by application of computer science concepts to the engineering domain, by helping domain experts to tailor interactive tools to suit their needs.
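The subclassing strategy described above might look like the following outline (the class names and the first-order tank model are hypothetical; the actual prototype simulated the chemical recovery cycle of a paper pulp mill):

```python
class SimulationModule:
    """Base simulation class; subclasses may extend it at the
    problem-domain, presentation or interaction level."""
    def __init__(self, state):
        self.state = state

    def step(self, dt):        # problem-domain level: the physics
        raise NotImplementedError

    def present(self):         # presentation level: how state is shown
        return f"state={self.state:.3f}"

class TankModule(SimulationModule):
    """Domain subclass: first-order drain of a liquor tank."""
    def step(self, dt):
        self.state -= 0.5 * self.state * dt

class LoggingTank(TankModule):
    """Presentation subclass: changes the display without touching
    the physics inherited from TankModule."""
    def present(self):
        return f"[tank] level {self.state:.2f} m"

tank = LoggingTank(state=2.0)
tank.step(dt=0.1)
print(tank.present())  # [tank] level 1.90 m
```

Each level can be overridden independently, so a domain expert can swap the physics while the generated graphical user interface, built against the presentation interface, remains unchanged.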

Relevance: 30.00%

Abstract:

Ionising radiation hazards are perhaps the most documented and regulated occupational and environmental hazard. In the radiological protection field a single expert advisory organisation, the International Commission on Radiological Protection (ICRP), has had an unusually large influence on the international standard-setting process. Two common, and opposing, views exist over the formulation of protection recommendations by the ICRP. The first, and most widely accepted, is that its recommendations are scientifically determined. The second is that its recommendations are politically or socially determined. Neither of these analyses adequately accounts for the complex process in which protection recommendations are formulated. A third view, provided by studies of the origins of scientific controversy, suggests that both science and social factors are important in the assessment and limitation of risk. The aim of this thesis is not simply to examine the origin of controversy. Issues of equal, if not greater, importance are the resolution of controversy, the formation of consensus and the maintenance of expert authority and influence. These issues form the central focus of this thesis. The aim is to assess the process through which the ICRP formulates its radiological protection recommendations and to comment on the extent to which these are influenced by the affiliations of its members. This thesis concludes that the ICRP's recommendations have been shaped by a complex relationship of scientific and social considerations, in which a socio-technical commitment to nuclear energy has played a key role. The Commission has responded to new scientific data by making complex changes to its philosophy and methods of describing risk. Where reductions in numerical limits have been applied, they have been accompanied by practical measures designed to limit the impact of the change and to provide continuity with the old limits and flexibility in the application of the new recommendations.

Relevance: 30.00%

Abstract:

This thesis is concerned with Organisational Problem Solving. The work reflects the complexities of organisational problem situations and the eclectic approach that has been necessary to gain an understanding of the processes involved. The thesis is structured into three main parts. Part I describes the author's understanding of problems and suitable approaches. Chapter 2 identifies the Transcendental Realist (TR) view of science (Harré 1970, Bhaskar 1975) as the best general framework for identifying suitable approaches to complex organisational problems. Chapter 3 discusses the relationship between Checkland's methodology (1972) and TR. The need to generate iconic (explanatory) models of the problem situation is identified, and the ability of viable system modelling to supplement the modelling stage of the methodology is explored in Chapter 4. Chapter 5 builds further on the methodology to produce an original iconic model of the methodological process. The model characterises the mechanisms of organisational problem situations as well as desirable procedural steps. The Weltanschauungen (W's) or "world views" of key actors are recognised as central to the mechanisms involved. Part II describes the experience which prompted the theoretical investigation. Chapter 6 describes the first year of the project; the success of this stage is attributed to the predominance of a single W. Chapter 7 describes the changes in the organisation which made the remaining phase of the project difficult. These difficulties are attributed to a failure to recognise the importance of differing W's. Part III revisits the theoretical and organisational issues. Chapter 8 identifies a range of techniques embodying W's which are compatible with the framework of Part I and which might usefully supplement it. Chapter 9 characterises possible W's in the sponsoring organisation. Throughout the work, an attempt is made to reflect the process as well as the product of the author's learning.

Relevance: 30.00%

Abstract:

This dissertation investigates the very important and current problem of modelling human expertise. This issue arises in any computer system that emulates human decision making, and it is particularly prominent in Clinical Decision Support Systems (CDSS) due to the complexity of the induction process and the vast number of parameters in most cases. Other issues such as human error and missing or incomplete data present further challenges. In this thesis, the Galatean Risk Screening Tool (GRiST) is used as an example of modelling clinical expertise and parameter elicitation. The tool is a mental health clinical record management system with a top layer of decision support capabilities. It is currently being deployed by several NHS mental health trusts across the UK. The aim of the research is to investigate the problem of parameter elicitation by inducing the parameters from real clinical data rather than from the human experts who provided the decision model. The induced parameters provide an insight both into the data relationships and into how experts make decisions themselves. The outcomes help further understand human decision making and, in particular, help GRiST provide more accurate emulations of risk judgements. Although the algorithms and methods presented in this dissertation are applied to GRiST, they can be adopted for other human knowledge engineering domains.
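The idea of inducing parameters from clinical records rather than eliciting them from experts can be illustrated generically (the cues, records and weighting scheme below are invented for illustration and are not GRiST's actual model): each cue's weight is estimated from how strongly its presence is associated with higher risk judgements in past records.

```python
# Hypothetical records: (cue values, clinician's risk judgement 0..1).
records = [
    ({"insomnia": 1, "isolation": 0}, 0.8),
    ({"insomnia": 1, "isolation": 1}, 0.9),
    ({"insomnia": 0, "isolation": 1}, 0.4),
    ({"insomnia": 0, "isolation": 0}, 0.1),
]

def induce_weights(records):
    """Estimate each cue's weight as the mean risk when the cue is
    present minus the mean risk when it is absent (a crude induced
    parameter standing in for an expert-elicited one)."""
    weights = {}
    for cue in records[0][0]:
        present = [r for values, r in records if values[cue] == 1]
        absent = [r for values, r in records if values[cue] == 0]
        weights[cue] = (sum(present) / len(present)
                        - sum(absent) / len(absent))
    return weights

print(induce_weights(records))
```

Comparing such induced weights against the weights experts originally supplied is one way to surface where expert judgement and the data relationships diverge.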

Relevance: 30.00%

Abstract:

Ontology construction for any domain is a labour-intensive and complex process. Any methodology that can reduce the cost and increase efficiency has the potential to make a major impact in the life sciences. This paper describes an experiment in ontology construction from text for the animal behaviour domain. Our objective was to see how much could be done in a simple and relatively rapid manner using a corpus of journal papers. We used a sequence of pre-existing text processing steps, and here describe the different choices made to clean the input, to derive a set of terms and to structure those terms in a number of hierarchies. We describe some of the challenges, especially that of focusing the ontology appropriately given a starting point of a heterogeneous corpus. Results - Using mainly automated techniques, we were able to construct an 18,055-term ontology-like structure with 73% recall of animal behaviour terms, but a precision of only 26%. We were able to clean unwanted terms from the nascent ontology using lexico-syntactic patterns that tested the validity of term inclusion within the ontology. We used the same technique to test for subsumption relationships between the remaining terms, to add structure to the initially broad and shallow structure we generated. All outputs are available at http://thirlmere.aston.ac.uk/~kiffer/animalbehaviour/. Conclusion - We present a systematic method for the initial steps of ontology or structured-vocabulary construction for scientific domains that requires limited human effort and can contribute both to ontology learning and to maintenance. The method is useful both for the exploration of a scientific domain and as a stepping stone towards formally rigorous ontologies. The filtering of recognised terms from a heterogeneous corpus, to focus upon those that are the topic of the ontology, is identified as one of the main challenges for research in ontology learning.
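The lexico-syntactic testing mentioned in the results is in the spirit of Hearst-style patterns; a minimal sketch (the single pattern and the sentences are illustrative assumptions, not the paper's full pattern set):

```python
import re

# "NP such as NP" suggests the second term is subsumed by the first.
# Terms are capped at two words here purely to keep the sketch simple.
PATTERN = re.compile(r"(\w+(?: \w+)?) such as (\w+(?: \w+)?)")

def find_subsumptions(sentences):
    """Extract candidate (hypernym, hyponym) pairs from raw text;
    sentences matching no pattern contribute nothing."""
    pairs = []
    for s in sentences:
        m = PATTERN.search(s)
        if m:
            pairs.append((m.group(1).strip().lower(),
                          m.group(2).strip().lower()))
    return pairs

corpus = [
    "Agonistic behaviours such as threat displays were recorded.",
    "The troop foraged at dawn.",  # no pattern: filtered out
]
print(find_subsumptions(corpus))  # [('agonistic behaviours', 'threat displays')]
```

The same machinery serves both roles the paper describes: a term that never appears in such patterns is a candidate for removal, while matched pairs supply subsumption links that add depth to the initially shallow hierarchy.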

Relevance: 30.00%

Abstract:

Ontology construction for any domain is a labour-intensive and complex process. Any methodology that can reduce the cost and increase efficiency has the potential to make a major impact in the life sciences. This paper describes an experiment in ontology construction from text for the animal behaviour domain. Our objective was to see how much could be done in a simple and rapid manner using a corpus of journal papers. We used a sequence of text processing steps, and describe the different choices made to clean the input, to derive a set of terms and to structure those terms in a hierarchy. In a very short space of time we were able to construct a 17,000-term ontology with a high percentage of suitable terms. We describe some of the challenges, especially that of focusing the ontology appropriately given a starting point of a heterogeneous corpus.

Relevance: 30.00%

Abstract:

Society depends on complex IT systems created by integrating and orchestrating independently managed systems. The dramatic increase in their scale and complexity over the past decade means that new software-engineering techniques are needed to help us cope with their inherent complexity. The key characteristic of these systems is that they are assembled from other systems that are independently controlled and managed. While there is increasing awareness in the software engineering community of the related issues, the most relevant background work comes from systems engineering. The interacting algorithms that led to the Flash Crash represent an example of a coalition of systems, serving the purposes of their owners and cooperating only because they have to. The owners of the individual systems were competing finance companies that were often mutually hostile. Each system jealously guarded its own information and could change without consulting any other system.