104 results for Matrix of complex negotiation


Relevance: 100.00%

Abstract:

As we enter an era of ‘big data’, asset information is becoming a deliverable of complex projects. Prior research suggests digital technologies enable rapid, flexible forms of project organizing. This research analyses practices of managing change in Airbus, CERN and Crossrail, through desk-based review, interviews, visits and a cross-case workshop. These organizations deliver complex projects, rely on digital technologies to manage large data sets, and use configuration management, a systems engineering approach with mid-20th-century origins, to establish and maintain integrity. In these organizations, configuration management has become more, rather than less, important. Asset information is structured, with change managed through digital systems, using relatively hierarchical, asynchronous and sequential processes. The paper contributes by uncovering limits to flexibility in complex projects where integrity is important. Challenges of managing change are discussed, considering the evolving nature of configuration management, the potential use of analytics on complex projects, and implications for research and practice.
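
The change-control process described above can be pictured as a simple state machine. The sketch below is purely illustrative (none of these names come from the study): it encodes a hierarchical, strictly sequential approval chain for a change request against a configuration item, of the kind configuration management uses to maintain integrity.

```python
# Minimal sketch of sequential change control over a configuration item.
# All names (ChangeRequest, APPROVAL_CHAIN, roles) are hypothetical.
from dataclasses import dataclass, field

APPROVAL_CHAIN = ["originator", "configuration_board", "chief_engineer"]  # assumed hierarchy

@dataclass
class ChangeRequest:
    item_id: str          # identifier of the configuration item under control
    description: str
    approvals: list = field(default_factory=list)

    def approve(self, role: str) -> None:
        # Approvals must arrive in the prescribed order (no skipping levels),
        # mirroring the hierarchical, sequential processes the paper observes.
        if self.baselined:
            raise ValueError("change request already fully approved")
        expected = APPROVAL_CHAIN[len(self.approvals)]
        if role != expected:
            raise ValueError(f"out-of-sequence approval: expected {expected!r}, got {role!r}")
        self.approvals.append(role)

    @property
    def baselined(self) -> bool:
        return self.approvals == APPROVAL_CHAIN
```

The point of the sketch is the asymmetry it encodes: flexibility is deliberately constrained, because every change must pass through the same ordered gates before a new baseline is established.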

Relevance: 100.00%

Abstract:

Due to their broad differentiation potential and their persistence into adulthood, human neural crest-derived stem cells (NCSCs) harbour great potential for autologous cellular therapies, which include the treatment of neurodegenerative diseases and replacement of complex tissues containing various cell types, as in the case of musculoskeletal injuries. Serum-free approaches often result in insufficient proliferation of stem cells, while the use of foetal calf serum introduces xenogeneic medium components. Thus, there is much need for alternative cultivation strategies. In this study we describe for the first time a novel, human blood plasma based semi-solid medium for the cultivation of human NCSCs. We cultivated human neural crest-derived inferior turbinate stem cells (ITSCs) within a blood plasma matrix, where they revealed higher proliferation rates compared to a standard serum-free approach. The three-dimensionality of the matrix was investigated using helium ion microscopy, and ITSC growth within the matrix was confirmed by laser scanning microscopy. Genetic stability and maintenance of stemness characteristics were assured in 3D-cultivated ITSCs, as demonstrated by an unchanged expression profile and the capability for self-renewal. ITSCs pre-cultivated in the 3D matrix differentiated efficiently into ectodermal and mesodermal cell types, particularly including osteogenic cell types. Furthermore, ITSCs cultivated as described here could easily be infected with lentiviruses directly within the substrate for potential tracing or gene-therapeutic approaches. Taken together, the use of human blood plasma as an additive for a completely defined medium points towards a personalisable and autologous cultivation of human neural crest-derived stem cells under clinical-grade conditions.

Relevance: 100.00%

Abstract:

Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) or Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools, and their adoption is growing. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires pre-processing, post-processing and visualisation techniques in complex data workflows. Currently, a main problem for the integrated pre-processing and mining of MRI data is the lack of comprehensive platforms that avoid the manual invocation of pre-processing and mining tools, which makes the process error-prone and inefficient. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench, which automates the pre-processing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionalities available in the KNIME workbench.
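
To make concrete the kind of import step K-Surfer automates, here is a hedged sketch (not K-Surfer's actual code) of reading a FreeSurfer statistics table, such as aseg.stats, into a tabular structure. It assumes the standard FreeSurfer convention of '#' comment lines, with column names on the '# ColHeaders' line.

```python
# Sketch only: parse a FreeSurfer stats file (e.g. aseg.stats) into a DataFrame.
# Assumes the standard format: '#' comments, with '# ColHeaders ...' naming columns.
import pandas as pd

def read_freesurfer_stats(path: str) -> pd.DataFrame:
    columns, rows = [], []
    with open(path) as fh:
        for line in fh:
            if line.startswith("# ColHeaders"):
                columns = line.split()[2:]      # drop '#' and 'ColHeaders'
            elif not line.startswith("#") and line.strip():
                rows.append(line.split())
    return pd.DataFrame(rows, columns=columns)

# Hypothetical usage:
# stats = read_freesurfer_stats("subject01/stats/aseg.stats")
# print(stats[["StructName", "Volume_mm3"]])
```

In a KNIME workflow this step would be a node; the snippet simply shows the parsing that a plug-in like K-Surfer spares the user from scripting by hand.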

Relevance: 100.00%

Abstract:

The turbulent structure of a stratocumulus-topped marine boundary layer over a 2-day period is observed with a Doppler lidar at Mace Head in Ireland. Using profiles of vertical velocity statistics, the bulk of the mixing is identified as cloud driven. This is supported by the pertinent feature of negative vertical velocity skewness in the sub-cloud layer, which extends, on occasion, almost to the surface. Both coupled and decoupled turbulence characteristics are observed. The length scales and timescales related to the cloud-driven mixing are investigated and shown to provide additional information about the structure and the source of the mixing inside the boundary layer. They are also shown to place constraints on the length of the sampling periods used to derive products, such as the turbulent dissipation rate, from lidar measurements. For this, the maximum wavelengths that belong to the inertial subrange are studied through spectral analysis of the vertical velocity. The maximum wavelength of the inertial subrange in the cloud-driven layer scales relatively well with the corresponding layer depth during periods of pronounced decoupling identified from the vertical velocity skewness. However, on many occasions, combining the analysis of the inertial subrange and the vertical velocity statistics suggests a higher decoupling height than expected from the skewness profiles. Our results show that investigation of the length scales related to the inertial subrange significantly complements the analysis of the vertical velocity statistics and enables a more confident interpretation of complex boundary layer structures using measurements from a Doppler lidar.
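
Two of the diagnostics mentioned above, the vertical velocity skewness and the inertial subrange of the velocity spectrum, can be sketched in a few lines. The snippet below is illustrative rather than the authors' processing chain; the sample rate and the synthetic data are assumptions.

```python
# Illustrative diagnostics on a vertical velocity time series w(t).
# Assumptions: 1 Hz sampling; random data stands in for lidar measurements.
import numpy as np
from scipy.signal import welch
from scipy.stats import skew

fs = 1.0                           # assumed sample rate (Hz)
w = np.random.randn(3600)          # stand-in for one hour of vertical velocity (m/s)

# Negative skewness in the sub-cloud layer is the signature of cloud-driven mixing.
print("vertical velocity skewness:", skew(w))

# In the inertial subrange the spectrum follows S(f) ~ f^(-5/3), so the
# compensated spectrum S * f^(5/3) is roughly flat; the longest wavelength for
# which this holds constrains the sampling period usable for dissipation-rate retrievals.
f, S = welch(w, fs=fs, nperseg=512)
compensated = S[1:] * f[1:] ** (5.0 / 3.0)
print("compensated spectrum (flat region marks the inertial subrange):", compensated[:5])
```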

Relevance: 100.00%

Abstract:

Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.
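
The distinction the model rests on can be made concrete with a toy example (hypothetical, not the paper's implementation): a sporadic change edits a single site in a single word, whereas a concerted (regular) sound change rewrites every occurrence of a phoneme across the lexicon in parallel.

```python
# Toy contrast between sporadic and concerted (regular) change.
def sporadic_change(lexicon, word_idx, pos, new_phoneme):
    """Edit one site in one word, as in an ordinary substitution."""
    w = list(lexicon[word_idx])
    w[pos] = new_phoneme
    lexicon[word_idx] = "".join(w)
    return lexicon

def concerted_change(lexicon, old_phoneme, new_phoneme):
    """Apply a sound law (e.g. d > t) lexicon-wide, in parallel."""
    return [w.replace(old_phoneme, new_phoneme) for w in lexicon]

lexicon = ["dag", "dil", "adam"]            # invented forms for illustration
print(concerted_change(lexicon, "d", "t"))  # ['tag', 'til', 'atam']
```

A phylogenetic model that allows only sporadic events must explain such a pattern as many independent changes; permitting a single concerted event is what yields the roughly 4-fold improvement reported above.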

Relevance: 100.00%

Abstract:

This paper presents a summary of the work done within the European Union's Seventh Framework Programme project ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants). ECLIPSE had a unique systematic concept for designing a realistic and effective mitigation scenario for short-lived climate pollutants (SLCPs; methane, aerosols and ozone, and their precursor species) and quantifying its climate and air quality impacts, and this paper presents the results in the context of this overarching strategy. The first step in ECLIPSE was to create a new emission inventory based on current legislation (CLE) for the recent past and until 2050. Substantial progress compared to previous work was made by including previously unaccounted types of sources such as flaring of gas associated with oil production, and wick lamps. These emission data were used for present-day reference simulations with four advanced Earth system models (ESMs) and six chemistry transport models (CTMs). The model simulations were compared with a variety of ground-based and satellite observational data sets from Asia, Europe and the Arctic. It was found that the models still underestimate the measured seasonality of aerosols in the Arctic but to a lesser extent than in previous studies. Problems likely related to the emissions were identified for northern Russia and India, in particular. To estimate the climate impacts of SLCPs, ECLIPSE followed two paths of research: the first path calculated radiative forcing (RF) values for a large matrix of SLCP species emissions, for different seasons and regions independently. Based on these RF calculations, the Global Temperature change Potential metric for a time horizon of 20 years (GTP20) was calculated for each SLCP emission type. This climate metric was then used in an integrated assessment model to identify all emission mitigation measures with a beneficial air quality and short-term (20-year) climate impact. These measures together defined a SLCP mitigation (MIT) scenario. Compared to CLE, the MIT scenario would reduce global methane (CH4) and black carbon (BC) emissions by about 50 and 80 %, respectively. For CH4, measures on shale gas production, waste management and coal mines were most important. For non-CH4 SLCPs, elimination of high-emitting vehicles and wick lamps, as well as reducing emissions from gas flaring, coal and biomass stoves, agricultural waste, solvents and diesel engines were most important. These measures lead to large reductions in calculated surface concentrations of ozone and particulate matter. We estimate that in the EU, the loss of statistical life expectancy due to air pollution was 7.5 months in 2010, which will be reduced to 5.2 months by 2030 in the CLE scenario. The MIT scenario would reduce this value by another 0.9 to 4.3 months. Substantially larger reductions due to the mitigation are found for China (1.8 months) and India (11–12 months). The climate metrics cannot fully quantify the climate response. Therefore, a second research path was taken. Transient climate ensemble simulations with the four ESMs were run for the CLE and MIT scenarios, to determine the climate impacts of the mitigation. In these simulations, the CLE scenario resulted in a surface temperature increase of 0.70 ± 0.14 K between the years 2006 and 2050. 
For the decade 2041–2050, the warming was reduced by 0.22 ± 0.07 K in the MIT scenario, in almost exact agreement with the response calculated from the emission metrics (reduced warming of 0.22 ± 0.09 K). The metrics calculations suggest that non-CH4 SLCPs contribute ~22 % to this response and CH4 78 %. This could not be fully confirmed by the transient simulations, which attributed about 90 % of the temperature response to CH4 reductions. Attribution of the observed temperature response to non-CH4 SLCP emission reductions, and to BC specifically, is hampered in the transient simulations by the small forcing of, and the co-emitted species in, the chosen emission basket. Nevertheless, an important conclusion is that our mitigation basket as a whole would lead to clear benefits for both air quality and climate. The climate response from BC reductions in our study is smaller than reported previously, possibly because our study is one of the first to use fully coupled climate models, in which unforced variability and sea-ice responses cause relatively strong temperature fluctuations that may counteract (and thus mask) the impacts of small emission reductions. The temperature responses to the mitigation were generally stronger over the continents than over the oceans, with the largest warming reduction, 0.44 (0.39–0.49) K, over the Arctic. Our calculations suggest particularly beneficial climate responses in southern Europe, where surface warming was reduced by about 0.3 K and precipitation rates were increased by about 15 (6–21) mm yr−1 (more than 4 % of total precipitation) from spring to autumn. Thus, the mitigation could help to alleviate expected future drought and water shortages in the Mediterranean area. We also report other important results of the ECLIPSE project.
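
The metric-based path reduces, in essence, to summing each species' emission change weighted by its GTP20. The sketch below shows only that arithmetic; the GTP values and emission changes are illustrative placeholders, not the ECLIPSE numbers.

```python
# Hedged sketch of a GTP20-weighted temperature response; all values hypothetical.
gtp20 = {"CH4": 4.6, "BC": 640.0}                 # K per emission unit (placeholders)
delta_emissions = {"CH4": -0.13, "BC": -0.0002}   # MIT minus CLE (placeholders)

delta_T = sum(gtp20[s] * delta_emissions[s] for s in gtp20)
ch4_share = gtp20["CH4"] * delta_emissions["CH4"] / delta_T
print(f"metric-based response: {delta_T:.2f} K; CH4 share: {ch4_share:.0%}")
```

This additive treatment illustrates why the metrics cannot fully quantify the climate response, which is what motivated the second, transient-simulation research path.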

Relevance: 100.00%

Abstract:

This study examines when “incremental” change is likely to trigger “discontinuous” change, using the lens of complex adaptive systems theory. Going beyond the simulations and case studies through which complex adaptive systems have been approached so far, we study the relationship between incremental organizational reconfigurations and discontinuous organizational restructurings using a large-scale database of U.S. Fortune 50 industrial corporations. We identify two types of escalation process in organizations: accumulation and perturbation. Under ordinary conditions, it is perturbation, rather than accumulation, that is more likely to trigger subsequent discontinuous change. Consistent with complex adaptive systems theory, organizations are more sensitive to both accumulation and perturbation in conditions of heightened disequilibrium. Contrary to expectations, highly interconnected organizations are not more liable to discontinuous change. We conclude with implications for further research, especially the need to attend to the potential role of managerial design and coping when transferring complex adaptive systems theory from natural systems to organizational systems.

Relevance: 100.00%

Abstract:

With the increase in e-commerce and the digitisation of design data and information, the construction sector has become reliant upon IT infrastructure and systems. The design and production process is more complex and more interconnected, and relies upon greater information mobility, with seamless exchange of data and information in real time. Construction small and medium-sized enterprises (CSMEs), in particular the speciality contractors, can effectively utilise cost-effective collaboration-enabling technologies, such as cloud computing, to help in the effective transfer of information and data to improve productivity. The system dynamics (SD) approach offers a perspective and tools to enable a better understanding of the dynamics of complex systems. This research adopts the SD methodology as a modelling and analysis tool in order to understand and identify the key drivers in the absorption of cloud computing by CSMEs. The aim of this paper is to determine how the use of SD can improve the management of information flow through collaborative technologies, leading to improved productivity. The data supporting the use of SD were obtained through a pilot study consisting of questionnaires and interviews with five CSMEs in the UK house-building sector.
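
As a flavour of what an SD model of cloud-computing absorption might look like, here is a minimal stock-and-flow sketch with Euler integration. The Bass-style diffusion structure and every parameter value are assumptions for illustration, not findings of the pilot study.

```python
# Minimal stock-and-flow sketch: diffusion of cloud collaboration tools among CSMEs.
# Structure and parameters are hypothetical.
population = 1000.0    # firms in the sector (model boundary)
adopters = 10.0        # initial adopters (the stock)
p, q = 0.01, 0.3       # innovation and imitation coefficients (assumed)
dt, steps = 0.25, 40   # quarterly time step over ten years

for _ in range(steps):
    # Flow: adoption driven by external influence (p) and word of mouth (q).
    adoption_rate = (p + q * adopters / population) * (population - adopters)
    adopters += adoption_rate * dt   # Euler integration of the stock

print(f"adopters after ten years: {adopters:.0f} of {population:.0f}")
```

In a full SD study, the key drivers identified from the questionnaires and interviews would enter as feedback loops modifying p and q; the sketch shows only the core mechanics.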

Relevance: 100.00%

Abstract:

This paper focuses on the language shift phenomenon in Singapore as a consequence of top-down policies. By looking at bilingual family language policies (FLPs), it examines the characteristics of Singapore’s multilingual nature and cultural diversity. Specifically, it looks at what languages are practised and how FLPs are enacted in Singaporean English-Chinese bilingual families, and to what extent macro language policies – i.e. national and educational language policies – influence and interact with FLPs. Involving 545 families and including parents and grandparents as participants, the study traces the trajectory of the policy history. Data sources comprise two parts: 1) a survey of prescribed linguistic practices; and 2) participant observation of the actual negotiation of FLP in face-to-face social interaction in bilingual English-Chinese families. The data provide valuable information on how FLP is enacted and language practices are negotiated, and on which linguistic practices have been changed or abandoned against the background of the Speak Mandarin Campaign and the current bilingual policy implemented in the 1970s. Importantly, the detailed face-to-face interactions and linguistic practices enhance our understanding of the subtleties and processes of language (dis)continuity in relation to policy interventions. The study also discusses the reality of language management measures in contrast to the government’s ‘separate bilingualism’ (Creese & Blackledge, 2011) expectations with regard to ‘striking a balance’ between Asian and Western culture (Curdt-Christiansen & Silver, 2013; Shepherd, 2005) and between English and mother tongue languages (Curdt-Christiansen, 2014). Demonstrating how parents and children negotiate their family language policy through translanguaging or heteroglossic practices (Canagarajah, 2013; García & Li Wei, 2014), this paper argues that ‘striking a balance’ as a political ideology places emphasis on discrete and separate notions of cultural and linguistic categorization and thus downplays the significant influences of the historical, political and sociolinguistic contexts in which people find themselves. This simplistic view of culture and linguistic code will inevitably constrain individuals’ language expression, as it regards code-switching and translanguaging as delimited and incompetent language behaviour.

Relevance: 100.00%

Abstract:

Parkinson's disease is characterized by the progressive and selective loss of dopaminergic neurons in the substantia nigra. It has been postulated that endogenously formed CysDA (5-S-cysteinyldopamine) and its metabolites may be, in part, responsible for this selective neuronal loss, although the mechanisms by which they contribute to such neurotoxicity are not understood. Exposure of neurons in culture to CysDA caused cell injury, apparent 12–48 h post-exposure. A portion of the neuronal death induced by CysDA was preceded by a rapid uptake and intracellular oxidation of CysDA, leading to an acute and transient activation of ERK2 (extracellular-signal-regulated kinase 2) and caspase 8. The oxidation of CysDA also induced the activation of apoptosis signal-regulating kinase 1 via its dephosphorylation at Ser967, the phosphorylation of JNK (c-Jun N-terminal kinase) and c-Jun (Ser73), and the activation of p38, caspase 3, caspase 8, caspase 7 and caspase 9. Concurrently, the dihydrobenzothiazine DHBT-1 [7-(2-aminoethyl)-3,4-dihydro-5-hydroxy-2H-1,4-benzothiazine-3-carboxylic acid], formed from the intracellular oxidation of CysDA, inhibits complex I, and the subsequent release of cytochrome c further potentiates pro-apoptotic mechanisms. Our data suggest a novel, comprehensive mechanism for CysDA toxicity that may hold relevance for the selective neuronal loss observed in Parkinson's disease.

Relevance: 100.00%

Abstract:

Two decades ago, Canada, Mexico, and the United States created a continental economy. The road to integration from the signing of the North American Free Trade Agreement has not been a smooth one. Along the way, Mexico lived through a currency crisis, a democratic transition, and the rising challenge of Asian manufacturing. Canada stayed united despite surging Quebecois nationalism during the 1990s; since then, it has seen dramatic economic changes with the explosion of hydrocarbon production and a much stronger currency. The United States saw a stock-market bust, the shock of 9/11, and the near-collapse of its financial system. All of these events have transformed the relationships that emerged after NAFTA entered into force in 1994. Given the tremendous changes, one might be skeptical that the circumstances and details of the negotiation and ratification of NAFTA hold lessons for the future of North America. However, the road to NAFTA had its own difficulties, and many of the issues involved in the negotiations underpin today's challenges. NAFTA was conceived at a time of profound change in the international system. When Mexican leaders surveyed the world two decades ago, they saw emerging regional groupings in Europe, Asia, and South America. Faced with a lack of interest or compatibility, they instead doubled down on North America. How did Mexican leaders reconsider their national interests and redefine Mexico's role in the world in light of those transformations? Unpublished Mexican documents from SECOFI, the secretariat most involved in negotiating NAFTA, help illustrate Mexican thinking about its interests and role at that time. Combining those insights with analysis of newly available evidence from U.S. presidential archives, this paper sheds light on the negotiations that concluded two decades ago.

Relevance: 100.00%

Abstract:

Genome-wide association studies (GWAS) have been widely used in the genetic dissection of complex traits. However, common methods are all based on a fixed-SNP-effect mixed linear model (MLM) and single-marker analysis, such as efficient mixed-model association (EMMA). These methods require Bonferroni correction for multiple tests, which is often too conservative when the number of markers is extremely large. To address this concern, we propose a random-SNP-effect MLM (RMLM) and a multi-locus RMLM (MRMLM) for GWAS. The RMLM simply treats the SNP effect as random, but it allows a modified Bonferroni correction to be used to calculate the threshold p-value for significance tests. The MRMLM is a multi-locus model including markers selected from the RMLM method with a less stringent selection criterion. Owing to its multi-locus nature, no multiple-test correction is needed. Simulation studies show that the MRMLM is more powerful in quantitative trait nucleotide (QTN) detection and more accurate in QTN effect estimation than the RMLM, which in turn is more powerful and accurate than the EMMA. To demonstrate the new methods, we analyzed six flowering-time-related traits in Arabidopsis thaliana and detected more genes than previously reported using the EMMA. Therefore, the MRMLM provides an alternative for multi-locus GWAS.
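
In standard notation, a random-SNP-effect mixed linear model of the kind described takes roughly the following form (a sketch of the usual parameterization; the paper's exact formulation may differ):

```latex
% Sketch of a random-SNP-effect MLM for marker k; notation assumed, not quoted.
\begin{align*}
  y &= X\beta + Z_k \gamma_k + u + e, \\
  \gamma_k &\sim N\!\big(0, \sigma_{\gamma_k}^2\big), \qquad
  u \sim N\!\big(0, K\sigma_u^2\big), \qquad
  e \sim N\!\big(0, I\sigma_e^2\big),
\end{align*}
```

where $y$ holds the phenotypes, $X\beta$ the fixed covariates, $Z_k$ the genotype codes of SNP $k$, $u$ a polygenic effect with kinship matrix $K$, and $e$ the residual. Treating $\gamma_k$ as random means each marker is screened by testing $H_0\colon \sigma_{\gamma_k}^2 = 0$, which is what permits the modified, less conservative Bonferroni threshold described above.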

Relevance: 100.00%

Abstract:

Matrix-assisted laser desorption/ionisation (MALDI) coupled with time-of-flight (TOF) mass spectrometry (MS) is a powerful tool for the analysis of biological samples, and nanoflow high-performance liquid chromatography (nanoHPLC) is a useful separation technique for the analysis of complex proteomics samples. The off-line combination of MALDI and nanoHPLC has been extensively investigated, and straightforward techniques have been developed, focussing particularly on automated MALDI sample preparation that yields sensitive and reproducible spectra. Normally, conventional solid MALDI matrices such as α-cyano-4-hydroxycinnamic acid (CHCA) are used for sample preparation. However, they have limited usefulness in quantitative measurements and automated data acquisition because they form heterogeneous crystals, resulting in highly variable ion yields and desorption/ionisation characteristics. Glycerol-based liquid support matrices (LSMs) have been proposed as an alternative to the traditional solid matrices, as they provide increased shot-to-shot reproducibility, leading to prolonged and stable ion signals and therefore better results. This chapter focuses on the integration of LSM MALDI matrices into the LC-MALDI MS/MS approach for identifying large and complex proteomes. The interface between LC and MALDI consists of a robotic spotter, which fractionates the eluent from the LC column into nanolitre volumes and simultaneously co-spots the liquid matrix with the eluent fractions onto a MALDI target plate via a sheath flow. The efficiency of this method is demonstrated through the analysis of trypsin digests of both bovine serum albumin (BSA) and Lactobacillus plantarum WCFS1 proteins.

Relevance: 100.00%

Abstract:

This study investigated the long-term effect of classroom-based input manipulation on children’s use of subordination in a story re-telling task; it also explored the role of receptive vocabulary skills and expressive grammatical abilities in predicting the likelihood of priming. During a two-week priming phase, 47 monolingual English-speaking five-year-olds heard 10 stories, one a day, that contained either a high proportion of subordinate clauses (subordination condition) or a high proportion of coordinate clauses (coordination condition). Post-intervention, there was a significant group difference in the likelihood of subordinate use, which persisted ten weeks after the priming. Neither expressive grammatical nor receptive vocabulary skills were positively correlated with the likelihood of subordinate use. These findings show that input manipulation can have a facilitative effect on the use of complex syntax over several weeks in a realistic communicative task.