578 results for MULTIPLE-ELECTRON-CAPTURE


Relevance: 20.00%

Abstract:

The purpose of this paper is to provide some insights about P2M and, more specifically, to develop some thoughts about Project Management seen as a Mirror, a place for reflection between the Mission of an organisation and its actual creation of Values (with an "s": a source of value for people, organisations and society). This place is the realm of complexity, of interactions between multiple variables, each of them having a specific time horizon, occupying a specific place and playing a specific role. Before developing this paper I would like to borrow from my colleague and friend, Professor Ohara, the following passages from a paper to be presented at the IPMA World Congress in New Delhi in November 2005. "P2M is the Japanese version of project & program management, which is the first standard guide for education and certification developed in 2001. A specific finding of P2M is characterized by "mission driven management of projects" or a program which harness complexity of problem solving observed in the interface between technical system and business model." (Ohara, 2005, IPMA Conference, New Delhi) "The term of "mission" is a key word in the field of corporate strategy, where it expresses raison d'être or "value of business". It is more specifically used for expressing "the client needs" in terms of a strategic business unit. The concept of mission is deemed to be a useful tool to share essential content of value and needs in message for complex project." (Ohara, 2005, IPMA Conference, New Delhi) "Mission is considered as a significant "metamodel representation" by several reasons. First, it represents multiple values for aspiration. The central objective of mission initiative is profiling of ideality in the future from reality, which all stakeholders are glad to accept and share. Second, it shall be within a stretch of efforts, and not beyond or outside of the realization. Though it looks like unique, it has to depict a solid foundation. The pragmatic sense of equilibrium between innovation and adaptation is required for the mission. Third, it shall imply a rough sketch for solution to critical issues for problems in reality." (Ohara, 2005, IPMA Conference, New Delhi) The "project modelling" idea has been introduced in P2M program management: a package of three project models, "scheme", "system" and "service", is given as a reference type program (Ohara, 2005, IPMA Conference, New Delhi). If these quotes apply to P2M, they are fully congruent with the results of the research undertaken, and the resulting meta-model and meta-method developed, by CIMAP, the ESC Lille Research Centre in Project & Program Management, since the 1980s. The paper starts by questioning the common Project Management (PM) paradigm. It then discusses the concept of Project and argues that an alternative epistemological position should be taken to capture the very nature of the PM field. Building on this, an argument about "the need of modelling to understand" is proposed, grounded in two theoretical roots. This leads to the conclusion that, in order to enable this modelling, a standard approach is necessary, but one that should be understood from the perspective of the Theory of Convention in order to facilitate situational and contextual application.

Relevance: 20.00%

Abstract:

This article presents a critical analysis of the current and proposed CCS legal frameworks across a number of jurisdictions in Australia in order to examine the legal treatment of the risks of carbon leakage from CCS operations. It does so through an analysis of the statutory obligations and liability rules established under the offshore Commonwealth and Victorian regimes, and onshore Queensland and Victorian legislative frameworks. Exposure draft legislation for CCS laws in Western Australia is also examined. In considering where the losses will fall in the event of leakage, the potential tortious and statutory liabilities of private operators and the State are addressed alongside the operation of statutory protections from liability. The current legal treatment of CCS under the new Australian Carbon Pricing Mechanism is also critiqued.

Relevance: 20.00%

Abstract:

Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport is a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of them. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to a review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics they evaluate and their usage scenarios are discussed. It is found that capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link these metrics to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to capture trade-offs between multiple criteria, such as security and processing time. Based on the CONOPS framework and the literature findings, guidance is provided for the development of future airport terminal models.

Relevance: 20.00%

Abstract:

"Particle Wave" comprises six lenticular panels hung in an even, horizontal sequence. Each panel alternates between two solid colour fields as you move past it. There are six colours in total, with each colour represented twice in the spectrum. From left to right, the panels move through yellow, orange, magenta, violet, blue, green and back to yellow. The work's title refers to the two competing theories of light, which can be understood as either paradoxical or complementary. Like these theories, the experience of viewing the work catches us in a double bind. While we can orient ourselves to see the solid colour fields one by one, we are never able to fully capture them all at once. In fact, it is only through our continual movement, and the consequent transitioning of visible colours, that we register the complete spectrum. Through this viewing experience, "Particle Wave" actively engages with our peripheral vision and the transitory nature of perception. It plays with the fundamental pleasures of colour and vision, and the uneasy seduction of being unable to grasp multiple phenomena simultaneously.

Relevance: 20.00%

Abstract:

Current conceptualisations of organisational processes consider them as internally optimised yet static systems. Yet turbulence in the contextual environment of a firm often leads to adaptation requirements that these processes are unable to fulfil. Based on a multiple case study of the core processes of two large organisations, we offer an extended conceptualisation of business processes as complex adaptive systems. This conceptualisation can enable firms to optimise business processes by analysing operations in different contexts and by examining the complex interaction between external, contextual elements and internal agent schemata. From this analysis, we discuss how information technology can play a vital role in achieving this goal by providing discovery, analysis and automation support. We detail implications for research and practice.

Relevance: 20.00%

Abstract:

Ubiquitylation is a necessary step in the endocytosis and lysosomal trafficking of many plasma membrane proteins and can also influence protein trafficking in the biosynthetic pathway. Although a molecular understanding of ubiquitylation in these processes is beginning to emerge, very little is known about the role deubiquitylation may play. Fat Facets in mouse (FAM) is a substrate-specific deubiquitylating enzyme highly expressed in epithelia, where it interacts with its substrate, β-catenin. Here we show that, in the polarized intestinal epithelial cell line T84, FAM localizes to multiple points of protein trafficking. FAM interacted with β-catenin and E-cadherin in T84 cells, but only in subconfluent cultures. FAM extensively colocalized with β-catenin in cytoplasmic puncta, but not at sites of cell-cell contact, and immunoprecipitated with β-catenin and E-cadherin from a higher-molecular-weight complex (~500 kDa). At confluence, FAM neither colocalized with nor immunoprecipitated β-catenin or E-cadherin, which were predominantly in a larger molecular weight complex (~2 MDa) at the cell surface. Overexpression of FAM in MCF-7 epithelial cells resulted in increased β-catenin levels, which localized to the plasma membrane. Expression of E-cadherin in L-cell fibroblasts resulted in the relocalization of FAM from the Golgi to cytoplasmic puncta. These data strongly suggest that FAM associates with E-cadherin and β-catenin during trafficking to the plasma membrane.

Relevance: 20.00%

Abstract:

Sequence data often contain competing signals that are detected by network programs or Lento plots. Such data can be formed by generating sequences on more than one tree and combining the results, yielding a mixture model. We report that, with such mixture models, the estimates of edge (branch) lengths from maximum likelihood (ML) methods that assume a single tree are biased. Based on the observed number of competing signals in real data, such a bias of ML is expected to occur frequently. Because network methods can recover competing signals more accurately, there is a need for ML methods that allow a network. A fundamental problem is that mixture models can have more parameters than can be recovered from the data, so that some mixtures are not, in principle, identifiable. We recommend that network programs be incorporated into best-practice analyses, along with ML and Bayesian trees.
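The following is a minimal sketch (not the analysis from the paper) of how the kind of mixture data described above can be produced: sites are simulated under the Jukes-Cantor model on two conflicting four-taxon trees and the resulting alignments are concatenated. Tree shapes, branch lengths and site counts are illustrative assumptions.

```python
# Build a two-tree "mixture" alignment by simulating under JC69 on two
# conflicting four-taxon trees and concatenating the results.
import math
import random

BASES = "ACGT"

def evolve_site(base, t):
    # JC69: probability that the state differs after a branch of length t
    # (expected substitutions per site).
    p_change = 0.75 * (1.0 - math.exp(-4.0 * t / 3.0))
    if random.random() < p_change:
        return random.choice([b for b in BASES if b != base])
    return base

def drop_site(node, state, seqs):
    # A node is either a taxon name (leaf) or a tuple of (child, branch_length)
    # pairs; the root is such a tuple.
    if isinstance(node, str):
        seqs[node] += state
        return
    for child, length in node:
        drop_site(child, evolve_site(state, length), seqs)

def simulate_alignment(tree, taxa, n_sites):
    seqs = {t: "" for t in taxa}
    for _ in range(n_sites):
        drop_site(tree, random.choice(BASES), seqs)
    return seqs

taxa = ["A", "B", "C", "D"]
cherry_cd = (("C", 0.1), ("D", 0.1))          # tree 1 supports the split AB|CD
tree1 = (("A", 0.1), ("B", 0.1), (cherry_cd, 0.3))
cherry_bd = (("B", 0.1), ("D", 0.1))          # tree 2 supports the conflicting split AC|BD
tree2 = (("A", 0.1), ("C", 0.1), (cherry_bd, 0.3))

random.seed(42)
part1 = simulate_alignment(tree1, taxa, 700)
part2 = simulate_alignment(tree2, taxa, 300)
mixture = {t: part1[t] + part2[t] for t in taxa}   # competing signals, 7:3
for t in taxa:
    print(t, mixture[t][:40], "...", len(mixture[t]), "sites")
```

Fitting a single tree by ML to `mixture` forces one set of branch lengths to account for both splits, which is the kind of bias described above, whereas a split-network program run on the same alignment would display both the AB|CD and AC|BD signals.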

Relevance: 20.00%

Abstract:

An anatase TiO2 material with hierarchically structured spheres consisting of ultrathin nanosheets with 100% of the [001] facet exposed was employed to fabricate dye-sensitized solar cells (DSCs). Investigation of the electron transport and back reaction of the DSCs by electrochemical impedance spectroscopy showed that the spheres had a threefold lower electron recombination rate compared to conventional TiO2 nanoparticles. In contrast, the effective electron diffusion coefficient, Dn, was not sensitive to the variation of the TiO2 morphology: the TiO2 spheres showed the same Dn as the nanoparticles. The influence of TiCl4 post-treatment on the conduction band of the TiO2 spheres and on the kinetics of electron transport and back reactions was also investigated. It was found that the TiCl4 post-treatment caused a downward shift of the TiO2 conduction band edge by 30 meV. Meanwhile, a fourfold increase of the effective electron lifetime of the DSC was also observed after TiCl4 treatment. The combined effect of the shift of the TiO2 conduction band and the change in electron recombination determined the open-circuit voltage of the DSC. © 2012 Wang et al.
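As a rough, illustrative estimate (not a calculation from the paper) of how the two reported effects pull the open-circuit voltage in opposite directions, the standard quasi-Fermi-level expression for the DSC open-circuit voltage can be evaluated with the numbers above, assuming first-order recombination (electron density scaling with lifetime at fixed generation) and room temperature:

```latex
% Back-of-the-envelope estimate only; assumes the electron density scales with
% the lifetime at fixed generation rate and kT/q of about 25.7 mV (room temperature).
\Delta V_{\mathrm{oc}} \approx \frac{\Delta E_{\mathrm{c}}}{q}
  + \frac{k_{\mathrm{B}}T}{q}\,\ln\frac{\tau_{2}}{\tau_{1}}
  \approx -30~\mathrm{mV} + 25.7~\mathrm{mV}\times\ln 4
  \approx +6~\mathrm{mV}.
```

Under these simplifying assumptions the band-edge shift and the slower recombination largely offset each other, which illustrates why the two effects have to be considered together when interpreting the open-circuit voltage.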

Relevance: 20.00%

Abstract:

Milk proteins are susceptible to chemical changes during processing and storage. We used proteomic tools to analyse bovine αS1-casein in UHT milk. On 2-D gels of freshly processed milk, αS1-casein presented as five or more spots owing to genetic polymorphism and variable phosphorylation. MS analysis after phosphopeptide enrichment allowed discrimination between phosphorylation states and genetic variants. We identified a new alternatively spliced isoform with a deletion of exon 17, producing a new C-terminal sequence, K164SQVNSEGLHSYGL177, with a novel phosphorylation site at S174. Storage of UHT milk at elevated temperatures produced additional, more acidic αS1-casein spots on the gels and decreased the resolution of minor forms. MS analysis indicated that non-enzymatic deamidation and loss of the N-terminal dipeptide were the major contributors to the changing spot pattern. These results highlight the important role of storage temperature in the stability of milk proteins and the utility of proteomic techniques for the analysis of proteins in food.

Relevance: 20.00%

Abstract:

The potential of multiple distribution static synchronous compensators (DSTATCOMs) to improve the voltage profile of radial distribution networks has been reported in the literature by a few authors. However, the operation of multiple DSTATCOMs across a distribution feeder may introduce control interactions and/or voltage instability. This study proposes a control scheme that alleviates interactions among controllers and enhances proper reactive power sharing among the DSTATCOMs. A generalised mathematical model is presented to analyse the interactions among any number of DSTATCOMs in the network. The criterion for controller design is developed by conducting eigenvalue analysis on this mathematical model. The proposed control scheme is tested in the time domain on a sample radial distribution feeder installed with multiple DSTATCOMs, and the test results are presented.
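As an illustrative sketch only (a toy linearised model with assumed numbers, not the generalised model from the paper), the kind of eigenvalue screening described above can be reproduced for two integral voltage controllers coupled through the feeder's voltage-sensitivity matrix; the least-damped mode then indicates how strongly the units interact for a given choice of gains:

```python
# Toy eigenvalue screening of two interacting DSTATCOM-like voltage
# controllers on one feeder (illustrative assumptions throughout).
import numpy as np

def modes(k1, k2, X, T=0.1):
    """States: [q1, q2, m1, m2] = reactive injections and filtered
    bus-voltage deviations.  dq_i/dt = -k_i * m_i (integral control),
    dm_i/dt = (sum_j X[i][j] * q_j - m_i) / T (measurement lag T)."""
    A = np.zeros((4, 4))
    A[0, 2] = -k1
    A[1, 3] = -k2
    A[2, :2] = X[0] / T; A[2, 2] = -1.0 / T
    A[3, :2] = X[1] / T; A[3, 3] = -1.0 / T
    eigs = np.linalg.eigvals(A)
    # Damping ratio of each mode; low values flag poorly damped interactions.
    zeta = [-e.real / abs(e) for e in eigs if abs(e) > 0]
    return eigs, min(zeta)

X = np.array([[0.10, 0.06],   # assumed dV_i/dQ_j sensitivities (pu/pu)
              [0.06, 0.10]])

for k in (25.0, 100.0, 400.0):
    eigs, zmin = modes(k, k, X)
    print(f"gain {k:6.0f}: min damping ratio {zmin:.2f}, eigs {np.round(eigs, 1)}")
```

In this sketch the least-damped mode is the one in which both units act together through the shared feeder; a design criterion of the kind described above would limit the gains (or coordinate the reactive power sharing) so that this interaction mode retains an adequate damping ratio.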

Relevance: 20.00%

Abstract:

Background
When large-scale trials investigate the effects of interventions on appetite, it is paramount to efficiently monitor large amounts of human data. The original hand-held Electronic Appetite Ratings System (EARS) was designed to facilitate the administration and data management of visual analogue scales (VAS) of subjective appetite sensations. The purpose of this study was to validate a novel hand-held method (EARS II (HP® iPAQ)) against the standard pen-and-paper (P&P) method and the previously validated EARS.

Methods
Twelve participants (5 male, 7 female, aged 18-40) took part in a fully repeated-measures design. Participants were randomly assigned, in a crossover design, to either high-fat (>48% fat) or low-fat (<28% fat) meal days, one week apart, and completed ratings using the three data-capture methods ordered according to a Latin square. The first set of appetite sensations was completed in a fasted state, immediately before a fixed breakfast. Thereafter, appetite sensations were completed every thirty minutes for 4 h. An ad libitum lunch was provided immediately before completing a final set of appetite sensations.

Results
Repeated-measures ANOVAs were conducted for ratings of hunger, fullness and desire to eat. There were no significant differences between P&P and either EARS or EARS II (p > 0.05). Correlation coefficients between P&P and EARS II, controlling for age and gender, were calculated on area-under-the-curve (AUC) ratings. R² values for Hunger (0.89), Fullness (0.96) and Desire to Eat (0.95) were statistically significant (p < 0.05).

Conclusions
EARS II was sensitive to the impact of a meal and the recovery of appetite during the postprandial period and is therefore an effective device for monitoring appetite sensations. This study provides evidence and support for further validation of the novel EARS II method for monitoring appetite sensations during large-scale studies. Its added versatility means that the system could also be used to monitor a range of other behavioural and physiological measures often important in clinical and free-living trials.
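As a small illustrative sketch (the rating values and both series are hypothetical, not data from the study), the area-under-the-curve summary used in the correlation analysis can be computed with the trapezoidal rule over the half-hourly postprandial rating times:

```python
# Hypothetical example: trapezoidal area under the curve (AUC) for one
# participant's hunger ratings (0-100 mm VAS) taken every 30 min for 4 h.

def trapezoid_auc(times_min, ratings):
    """AUC in mm*min over the rating period."""
    auc = 0.0
    for (t0, r0), (t1, r1) in zip(zip(times_min, ratings),
                                  zip(times_min[1:], ratings[1:])):
        auc += 0.5 * (r0 + r1) * (t1 - t0)
    return auc

times = [0, 30, 60, 90, 120, 150, 180, 210, 240]    # minutes after breakfast
hunger_pp   = [72, 25, 30, 38, 45, 52, 60, 66, 70]  # pen-and-paper ratings (made up)
hunger_ears = [70, 27, 31, 36, 47, 50, 62, 65, 72]  # EARS II ratings (made up)

print("P&P AUC:    ", trapezoid_auc(times, hunger_pp), "mm*min")
print("EARS II AUC:", trapezoid_auc(times, hunger_ears), "mm*min")
```

Per-participant AUC values of this kind, computed from each capture method, are what the reported R² values compare across P&P and EARS II.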