945 results for Complex needs
Abstract:
Management (or perceived mismanagement) of large-scale, complex projects poses special problems and often results in spectacular failures, cost overruns, time blowouts and stakeholder dissatisfaction. While traditional project management responds with increasingly administrative constraints, we argue that leaders of such projects also need to display adaptive and enabling behaviours to foster adaptive processes such as opportunity recognition, which requires an interaction of cognitive and affective processes across individual, project, and team-leader attributes and behaviours. At the core of the model we propose is an interaction of cognitive flexibility, affect and emotional intelligence. The result of this interaction is enhanced leader opportunity recognition that, in turn, facilitates multilevel outcomes.
Abstract:
In this paper, we develop a conceptual model to explore the perceived complementary congruence between complex project leaders and the demands of the complex project environment to understand how leaders’ affective and behavioural performance at work might be impacted by this fit. We propose that complex project leaders high in emotional intelligence and cognitive flexibility should report a higher level of fit between themselves and the complex project environment. This abilities-demands measure of fit should then relate to affective and behavioural performance outcomes, such that leaders who perceive a higher level of fit should establish and maintain more effective, higher quality project stakeholder relationships than leaders who perceive a lower level of fit.
Abstract:
Mixture models are a flexible tool for unsupervised clustering that have found popularity in a vast array of research areas. In studies of medicine, the use of mixtures holds the potential to greatly enhance our understanding of patient responses through the identification of clinically meaningful clusters that, given the complexity of many data sources, may otherwise be intangible. Furthermore, when developed in the Bayesian framework, mixture models provide a natural means for capturing and propagating uncertainty in different aspects of a clustering solution, arguably resulting in richer analyses of the population under study. This thesis aims to investigate the use of Bayesian mixture models in analysing varied and detailed sources of patient information collected in the study of complex disease. The first aim of this thesis is to showcase the flexibility of mixture models in modelling markedly different types of data. In particular, we examine three common variants on the mixture model, namely, finite mixtures, Dirichlet Process mixtures and hidden Markov models. Beyond the development and application of these models to different sources of data, this thesis also focuses on modelling different aspects relating to uncertainty in clustering. Examples of clustering uncertainty considered are uncertainty in a patient's true cluster membership and accounting for uncertainty in the true number of clusters present. Finally, this thesis aims to address and propose solutions to the task of comparing clustering solutions, whether this be comparing patients or observations assigned to different subgroups or comparing clustering solutions over multiple datasets. To address these aims, we consider a case study in Parkinson's disease (PD), a complex and commonly diagnosed neurodegenerative disorder. In particular, two commonly collected sources of patient information are considered.
The first source of data, which constitutes the first half of this thesis, concerns symptoms associated with PD, recorded using the Unified Parkinson's Disease Rating Scale (UPDRS). The second half of this thesis is dedicated to the analysis of microelectrode recordings collected during Deep Brain Stimulation (DBS), a popular palliative treatment for advanced PD. Analysis of this second source of data centres on the problems of unsupervised detection and sorting of action potentials, or "spikes", in recordings of multiple cell activity, providing valuable information on real-time neural activity in the brain.
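As a minimal illustration of the finite mixture clustering idea (a frequentist EM sketch for one-dimensional data, not the Bayesian models developed in the thesis; the synthetic "subgroups" below are invented):

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100):
    """Fit a one-dimensional Gaussian mixture with k components by
    expectation-maximisation; return weights, means and spreads."""
    n = len(x)
    w = np.full(k, 1.0 / k)                          # mixing weights
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))  # spread-out initial means
    sd = np.full(k, x.std())                         # initial spreads
    for _ in range(iters):
        # E-step: responsibility of each component for each observation
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from weighted statistics
        nk = resp.sum(axis=0)
        w, mu = nk / n, (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

# Two synthetic, well-separated "patient subgroups"
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(6, 1, 300)])
w, mu, sd = em_gmm_1d(x)
```

A Bayesian treatment, as in the thesis, would additionally place priors on the weights, means and spreads and sample the posterior (e.g. by Gibbs sampling), which is what yields the cluster-membership uncertainty discussed above.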
Abstract:
Eukaryotic cell cycle progression is mediated by phosphorylation of protein substrates by cyclin-dependent kinases (CDKs). A critical substrate of CDKs is the product of the retinoblastoma tumor suppressor gene, pRb, which inhibits G1-S phase cell cycle progression by binding and repressing E2F transcription factors. CDK-mediated phosphorylation of pRb alleviates this inhibitory effect to promote G1-S phase cell cycle progression. pRb represses transcription by binding to the E2F transactivation domain and recruiting the mSin3·histone deacetylase (HDAC) transcriptional repressor complex via the retinoblastoma-binding protein 1 (RBP1). RBP1 binds to the pocket region of pRb via an LXCXE motif and to the SAP30 subunit of the mSin3·HDAC complex and, thus, acts as a bridging protein in this multisubunit complex. In the present study we identified RBP1 as a novel CDK substrate. RBP1 is phosphorylated by CDK2 on serines 864 and 1007, which are N- and C-terminal to the LXCXE motif, respectively. CDK2-mediated phosphorylation of RBP1 or pRb destabilizes their interaction in vitro, with concurrent phosphorylation of both proteins leading to their dissociation. Consistent with these findings, RBP1 phosphorylation is increased during progression from G1 into S-phase, with a concurrent decrease in its association with pRb in MCF-7 breast cancer cells. These studies provide new mechanistic insights into CDK-mediated regulation of the pRb tumor suppressor during cell cycle progression, demonstrating that CDK-mediated phosphorylation of both RBP1 and pRb induces their dissociation to mediate release of the mSin3·HDAC transcriptional repressor complex from pRb to alleviate transcriptional repression of E2F.
Abstract:
Kinematic models are commonly used to quantify foot and ankle kinematics, yet no marker sets or models have been proven reliable or accurate when wearing shoes. Further, the minimal detectable difference of a developed model is often not reported. We present a kinematic model that is reliable, accurate and sensitive enough to describe the kinematics of the foot–shoe complex and lower leg during walking gait. In order to achieve this, a new marker set was established, consisting of 25 markers applied on the shoe and skin surface, which informed a four-segment kinematic model of the foot–shoe complex and lower leg. Three independent experiments were conducted to determine the reliability, accuracy and minimal detectable difference of the marker set and model. Inter-rater reliability of marker placement on the shoe was good to excellent (ICC = 0.75–0.98), indicating that markers could be applied reliably between raters. Intra-rater reliability was better for the experienced rater (ICC = 0.68–0.99) than the inexperienced rater (ICC = 0.38–0.97). The accuracy of marker placement along each axis was <6.7 mm for all markers studied. Minimal detectable difference (MDD90) thresholds were defined for each joint: tibiocalcaneal joint, MDD90 = 2.17–9.36°; tarsometatarsal joint, MDD90 = 1.03–9.29°; and metatarsophalangeal joint, MDD90 = 1.75–9.12°. The proposed thresholds are specific to the description of shod motion, and can be used in future research aimed at comparing different footwear.
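MDD90 thresholds of this kind are conventionally derived from the standard error of measurement. Assuming the standard formulation (the abstract does not state the exact calculation used), the computation can be sketched as:

```python
import math

def mdd90(sd, icc):
    """Minimal detectable difference at 90% confidence.

    SEM = SD * sqrt(1 - ICC); MDD90 = 1.645 * sqrt(2) * SEM,
    where 1.645 is the 90% z-value and sqrt(2) accounts for the
    difference between two repeated measurements.
    """
    sem = sd * math.sqrt(1.0 - icc)
    return 1.645 * math.sqrt(2.0) * sem

# e.g. a joint angle with between-trial SD = 5 degrees and ICC = 0.90
print(round(mdd90(5.0, 0.90), 2))  # prints 3.68
```

Changes smaller than this value cannot be distinguished from measurement noise at the stated confidence level, which is why the abstract reports per-joint MDD90 ranges.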
Abstract:
Flow-oriented process modeling languages have a long tradition in the area of Business Process Management and are widely used for capturing activities with their behavioral and data dependencies. Individual events were introduced for triggering process instantiation and activities. However, real-world business cases drive the need to also cover complex event patterns as they are known in the field of Complex Event Processing. Therefore, this paper puts forward a catalog of requirements for handling complex events in process models, which can be used as a reference framework for assessing process definition languages and systems. An assessment of BPEL and BPMN is provided.
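As a toy illustration of the kind of complex event pattern such a catalog covers (the event names, pattern and window semantics below are invented, not taken from the paper), a sequence pattern within a time window can be detected as:

```python
from dataclasses import dataclass

@dataclass
class Event:
    type: str
    time: float  # seconds since stream start

def match_sequence(events, pattern, window):
    """Detect the first occurrence of `pattern` (an ordered list of event
    types) in which all matched events fall within `window` seconds of the
    first one. Returns the matched events, or None if no match."""
    matched = []
    for ev in sorted(events, key=lambda e: e.time):
        if matched and ev.time - matched[0].time > window:
            matched = []  # window exceeded: restart the partial match
        if ev.type == pattern[len(matched)]:
            matched.append(ev)
            if len(matched) == len(pattern):
                return matched
    return None

pattern = ["order_placed", "payment_failed", "order_cancelled"]
stream = [Event("order_placed", 0.0), Event("payment_failed", 2.0),
          Event("payment_failed", 3.0), Event("order_cancelled", 4.5)]
hit = match_sequence(stream, pattern, window=10.0)
```

Production CEP engines support far richer constructs (negation, aggregation, correlation keys), which is precisely the gap between plain process-instantiation events and the requirements the paper catalogs.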
Abstract:
This article investigates the complex phenomenon of major gift giving to charitable institutions. Drawing on empirical evidence from interviews with 16 Australian major donors (who gave a single gift of at least AU$10,000 in 2008 or 2009), we seek to better understand donor expectations and (dis)satisfaction. Given growing need for social services, and the competition among nonprofit organisations (NPOs) to secure sustainable funding, this research is particularly timely. Currently, little is known about major donors’ expectations, wants and needs. Equity theory, with the concept of reciprocity at its core, was found to provide a useful framework for understanding these phenomena. A model of equitable major gift relationships was developed from the data, which portrays balanced relationships and identifies potential areas of dissatisfaction for major donors. We conclude by offering suggestions for NPOs seeking to understand the complexities of major gift relationships, with practical implications for meeting donors’ needs.
Abstract:
To ensure infrastructure assets are procured and maintained by government on behalf of citizens, appropriate policy and institutional architecture is needed, particularly if a fundamental shift to more sustainable infrastructure is the goal. The shift in recent years from competitive and resource-intensive procurement to more collaborative and sustainable approaches to infrastructure governance is considered a major transition in infrastructure procurement systems. In order to better understand this transition in infrastructure procurement arrangements, the concept of emergence from Complex Adaptive Systems (CAS) theory is offered as a key construct. Emergence holds that micro interactions can result in emergent macro order. Applying the concept of emergence to infrastructure procurement, this research examines how interaction of agents in individual projects can result in different industry structural characteristics. The paper concludes that CAS theory, and particularly the concept of ‘emergence’, provides a useful construct to understand infrastructure procurement dynamics and progress towards sustainability.
Abstract:
The purpose of this paper is to provide some insights about P2M and, more specifically, to develop some thoughts about Project Management seen as a Mirror, a place for reflection…, between the Mission of the organisation and its actual creation of Values (with s: a source of value for people, organisations and society). This place is the realm of complexity, of interactions between multiple variables, each of them having a specific time horizon and occupying a specific place, playing a specific role. Before developing this paper, I would like to borrow from my colleague and friend, Professor Ohara, the following excerpts from a paper to be presented at the IPMA World Congress in New Delhi in November 2005. “P2M is the Japanese version of project & program management, which is the first standard guide for education and certification developed in 2001. A specific finding of P2M is characterized by “mission driven management of projects” or a program which harness complexity of problem solving observed in the interface between technical system and business model.” (Ohara, 2005, IPMA Conference, New Delhi) “The term of “mission” is a key word in the field of corporate strategy, where it expresses raison d’être or “value of business”. It is more specifically used for expressing “the client needs” in terms of a strategic business unit. The concept of mission is deemed to be a useful tool to share essential content of value and needs in message for complex project.” (Ohara, 2005, IPMA Conference, New Delhi) “Mission is considered as a significant “metamodel representation” by several reasons. First, it represents multiple values for aspiration. The central objective of mission initiative is profiling of ideality in the future from reality, which all stakeholders are glad to accept and share. Second, it shall be within a stretch of efforts, and not beyond or outside of the realization. Though it looks like unique, it has to depict a solid foundation.
The pragmatic sense of equilibrium between innovation and adaptation is required for the mission. Third, it shall imply a rough sketch for solution to critical issues for problems in reality.” (Ohara, 2005, IPMA Conference, New Delhi) “Project modeling” idea has been introduced in P2M program management. A package of three project models of “scheme”, “system” and “service” are given as a reference type program. (Ohara, 2005, IPMA Conference, New Delhi) If these quotes apply to P2M, they are fully congruent with the results of the research undertaken, and with the resulting meta-model and meta-method developed by CIMAP, the ESC Lille Research Centre in Project & Program Management, since the 1980s. The paper starts by questioning the common Project Management (PM) paradigm. Then, discussing the concept of Project, it argues that an alternative epistemological position should be taken to capture the very nature of the PM field. Based on this, a development about “the need of modelling to understand” is proposed, grounded in two theoretical roots. This leads to the conclusion that, in order to enable this modelling, a standard approach is necessary, but should be understood under the perspective of the Theory of Convention in order to facilitate a situational and contextual application.
Abstract:
Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport are a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of them. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to the review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics that they evaluate and their usage scenarios are discussed. It is found that capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link those to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to be able to capture trade-offs between multiple criteria such as security and processing time. Based on the CONOPS framework and literature findings, guidance is provided for the development of future airport terminal models.
Abstract:
The design of pre-contoured fracture fixation implants (plates and nails) that correctly fit the anatomy of a patient utilises 3D models of long bones with accurate geometric representation. 3D data is usually available from computed tomography (CT) scans of human cadavers that generally represent the over-60 age group. Thus, despite the fact that half of the seriously injured population is aged 30 years or below, virtually no data exists from these younger age groups to inform the design of implants that optimally fit patients from these groups. Hence, relevant bone data from these age groups is required. The current gold standard for acquiring such data, CT, involves ionising radiation and cannot be used to scan healthy human volunteers. Magnetic resonance imaging (MRI) has been shown to be a potential alternative in previous studies conducted using small bones (tarsal bones) and parts of long bones. However, in order to use MRI effectively for 3D reconstruction of human long bones, further validations using long bones and appropriate reference standards are required. Accurate reconstruction of 3D models from CT or MRI data sets requires an accurate image segmentation method. Currently available sophisticated segmentation methods involve complex programming and mathematics that researchers are not trained to perform. Therefore, an accurate but relatively simple segmentation method is required for segmentation of CT and MRI data. Furthermore, some of the limitations of 1.5T MRI, such as very long scanning times and poor contrast in articular regions, can potentially be reduced by using higher-field 3T MRI. However, a quantification of the signal-to-noise ratio (SNR) gain at the bone-soft tissue interface should be performed; this is not reported in the literature. As MRI scanning of long bones has very long scanning times, the acquired images are more prone to motion artefacts due to random movements of the subject's limbs.
One of the artefacts observed is the step artefact that is believed to occur from the random movements of the volunteer during a scan. This needs to be corrected before the models can be used for implant design. As the first aim, this study investigated two segmentation methods, intensity thresholding and Canny edge detection, as accurate but simple methods for segmentation of MRI and CT data. The second aim was to investigate the usability of MRI as a radiation-free imaging alternative to CT for reconstruction of 3D models of long bones. The third aim was to use 3T MRI to improve the poor contrast in articular regions and long scanning times of current MRI. The fourth and final aim was to minimise the step artefact using 3D modelling techniques. The segmentation methods were investigated using CT scans of five ovine femora. Single-level thresholding was performed using a visually selected threshold level to segment the complete femur. For multilevel thresholding, multiple threshold levels calculated from the threshold selection method were used for the proximal, diaphyseal and distal regions of the femur. Canny edge detection was applied by delineating the outer and inner contour of 2D images and then combining them to generate the 3D model. Models generated from these methods were compared to the reference standard generated using the mechanical contact scans of the denuded bone. The second aim was achieved using CT and MRI scans of five ovine femora and segmenting them using the multilevel threshold method. A surface geometric comparison was conducted between CT-based, MRI-based and reference models. To quantitatively compare the 1.5T images to the 3T MRI images, the right lower limbs of five healthy volunteers were scanned using scanners from the same manufacturer. The images obtained using identical protocols were compared by means of SNR and contrast-to-noise ratio (CNR) of muscle, bone marrow and bone.
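The SNR and CNR comparisons described above follow standard image-quality definitions. One common convention (whether it is the exact formulation used in the thesis is an assumption; the intensity values below are synthetic) is:

```python
import numpy as np

def snr(signal_roi, noise_roi):
    """Signal-to-noise ratio: mean intensity of a tissue region over the
    standard deviation of a background (noise-only) region."""
    return np.mean(signal_roi) / np.std(noise_roi)

def cnr(roi_a, roi_b, noise_roi):
    """Contrast-to-noise ratio between two tissues (e.g. bone vs. muscle)
    relative to background noise."""
    return abs(np.mean(roi_a) - np.mean(roi_b)) / np.std(noise_roi)

# Synthetic region-of-interest intensities standing in for real MRI voxels
rng = np.random.default_rng(0)
bone = rng.normal(100, 5, 1000)
muscle = rng.normal(60, 5, 1000)
background = rng.normal(0, 2, 1000)
```

A higher-field scanner that doubles the signal for the same noise floor would roughly double both quantities at the bone-muscle interface, which is the kind of gain the 1.5T-versus-3T comparison quantifies.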
In order to correct the step artefact in the final 3D models, the step was simulated in five ovine femora scanned with a 3T MRI scanner. The step was corrected using an aligning method based on the iterative closest point (ICP) algorithm. The present study demonstrated that the multi-threshold approach, in combination with the threshold selection method, can generate 3D models of long bones with an average deviation of 0.18 mm. The corresponding figure for the single-threshold method was 0.24 mm. There was a statistically significant difference between the accuracy of models generated by the two methods. In comparison, the Canny edge detection method generated an average deviation of 0.20 mm. MRI-based models exhibited 0.23 mm average deviation in comparison to the 0.18 mm average deviation of CT-based models. The differences were not statistically significant. 3T MRI improved the contrast in the bone-muscle interfaces of most anatomical regions of femora and tibiae, potentially improving the inaccuracies conferred by poor contrast of the articular regions. Using the robust ICP algorithm to align the 3D surfaces, the step artefact caused by the volunteer moving the leg was corrected, generating errors of 0.32 ± 0.02 mm when compared with the reference standard. The study concludes that magnetic resonance imaging, together with simple multilevel thresholding segmentation, is able to produce 3D models of long bones with accurate geometric representations. The method is, therefore, a potential alternative to the current gold standard CT imaging.
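The ICP-based alignment used for the step correction can be sketched with a generic point-to-point ICP in two dimensions (a sketch of the general algorithm on synthetic data, not the authors' implementation, which operated on 3D bone surfaces):

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch/Procrustes solution via SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, iters=30):
    """Iterative closest point: repeatedly match each source point to its
    nearest destination point and solve for the best rigid transform."""
    cur = src.copy()
    for _ in range(iters):
        d = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d.argmin(axis=1)]   # brute-force nearest neighbours
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur

# Synthetic "stepped" surface: the reference cloud slightly rotated and shifted
rng = np.random.default_rng(2)
dst = rng.uniform(0, 10, size=(200, 2))
c = dst.mean(axis=0)
theta = np.deg2rad(2.0)
R0 = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
src = (dst - c) @ R0.T + c + np.array([0.2, -0.1])
aligned = icp(src, dst)
```

Real surface scans would use a spatial index (e.g. a k-d tree) for the nearest-neighbour step and a convergence tolerance rather than a fixed iteration count.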
Abstract:
This thesis reports research focused on the well-being and employment experiences of mothers who have a child with special health care needs. Data are drawn from Growing Up in Australia: The Longitudinal Study of Australian Children (LSAC), a public access database. The thesis uses the social ecological theory of Bronfenbrenner (1984) and the work of Zubrick et al. (2000) on human and social capital to inform the conceptual framework developed for the research. Four studies are reported. LSAC has a nationally representative sample of Australian children and their families. The study is tracking the development of 10,000 children, with data collected every two years, from 2004 to 2018. This thesis uses data from the Kindergarten Cohort of LSAC. The 4,983 children in the Kindergarten Cohort were aged 4 years at recruitment into the study in 2004. The analyses in this thesis use child and family data from Wave 1 (2004) and Wave 2 (2006) for a subsample of the children identified as having special health care needs, broadly defined as chronic health conditions or developmental difficulties. This identification is based on a short screening questionnaire included in the Parent 1 Interview at each wave of the data collection. However, it is the well-being and employment experiences of the mothers of these children that are the primary focus in three of the four studies reported in this thesis.
Abstract:
The fashion ecosystem is at boiling point as consumers turn up the heat in all areas of the fashion value, trend and supply chain. While traditionally fashion has been a monologue from designer brand to consumer, new technology and the virtual world have given consumers a voice to engage brands in a conversation to express evolving needs, ideas and feedback. Product customisation is no longer innovative. Successful brands are including customers in the design process and holding conversations ‘with’ them to improve product, manufacturing, sales, distribution, marketing and sustainable business practices. Co-creation and crowd sourcing are integral to any successful business model, and designers and manufacturers are supplying the technology or tools for these creative, active, participatory ‘prosumers’. With this collaboration, however, there arises a worrying trend for fashion professionals. The ‘design it yourself’ ‘indiepreneur’, armed with technology, the internet, excess manufacturing capacity, crowd funding and the idea of sharing the creative integrity of a product (‘copyleft’, not copyright), is challenging the notion that the fashion supply chain is complex. The passive ‘consumer’ no longer exists. Fashion designers now share the stage with ‘amateur’ creators who are disrupting every activity they touch, while being motivated by profit as well as a quest for originality and innovation. This paper examines the effects this ‘consumer’ engagement is having on traditional fashion models and the fashion supply chain. Crowd sourcing, crowd funding, co-creating, design it yourself, global sourcing, the virtual supply chain, social media, online shopping, group buying, consumer-to-consumer marketing and retail, and branding the ‘individual’ are indicative of the new consumer-driven fashion models. Consumers now drive the fashion industry, from setting trends through to creating, producing, selling and marketing product.
They can turn up the heat at any time, and at any point, in the fashion supply chain. They are raising the temperature at each and every stage of the chain, decreasing or eliminating the processes involved: decreasing the risk of fashion obsolescence, quantities for manufacture, complexity of distribution and the consumption of product; eliminating certain stages altogether; and limiting the brand's role as custodian of marketing. Some brands are discovering a new ‘enemy’: the very people they are trying to sell to. Keywords: fashion supply chain, virtual world, consumer, ‘prosumers’, co-creation, fashion designers
Creativity in policing: building the necessary skills to solve complex and protracted investigations
Abstract:
Despite an increased focus on proactive policing in recent years, criminal investigation is still perhaps the most important task of any law enforcement agency. As a result, the skills required to carry out a successful investigation or to be an ‘effective detective’ have been subjected to much attention and debate (Smith and Flanagan, 2000; Dean, 2000; Fahsing and Gottschalk, 2008:652). Stelfox (2008:303) states that “The service’s capacity to carry out investigations comprises almost entirely the expertise of investigators.” In this respect, Dean (2000) highlighted the need to profile criminal investigators in order to promote further understanding of the cognitive approaches they take to the process of criminal investigation. As a result of his research, Dean (2000) produced a theoretical framework of criminal investigation, which included four disparate cognitive or ‘thinking styles’: ‘Method’, ‘Challenge’, ‘Skill’ and ‘Risk’. While the Method and Challenge styles deal with adherence to Standard Operating Procedures (SOPs) and the internal ‘drive’ that keeps an investigator going, the Skill and Risk styles both tap into the concept of creativity in policing. It is these two latter styles that provide the focus for this paper. This paper presents a brief discussion of Dean’s (2000) Skill and Risk styles before giving an overview of the broader literature on creativity in policing. The potential benefits of a creative approach, as well as some hurdles which need to be overcome when proposing the integration of creativity within the policing sector, are then discussed. Finally, the paper concludes by proposing further research into Dean’s (2000) Skill and Risk styles and by stressing the need for significant changes to the structure and approach of the traditional policing organisation before creativity in policing is given the status it deserves.
Abstract:
A key function of activated macrophages is to secrete proinflammatory cytokines such as TNF; however, the intracellular pathway and machinery responsible for cytokine trafficking and secretion is largely undefined. Here we show that individual SNARE proteins involved in vesicle docking and fusion are regulated at the levels of both gene and protein expression upon stimulation with the bacterial cell wall component lipopolysaccharide. Focusing on two intracellular SNARE proteins, Vti1b and syntaxin 6 (Stx6), we show that they are up-regulated in conjunction with increasing cytokine secretion in activated macrophages and that their levels are selectively titrated to accommodate the volume and timing of post-Golgi cytokine trafficking. In macrophages, Vti1b and syntaxin 6 are localized on intracellular membranes and are present on isolated Golgi membranes and on Golgi-derived TNF vesicles budded in vitro. By immunoprecipitation, we find that Vti1b and syntaxin 6 interact to form a novel intracellular Q-SNARE complex. Functional studies using overexpression of full-length and truncated proteins show that both Vti1b and syntaxin 6 function and have rate-limiting roles in TNF trafficking and secretion. This study shows how macrophages have uniquely adapted a novel Golgi-associated SNARE complex to accommodate their requirement for increased cytokine secretion.