880 results for group membership models
Abstract:
With the prospect of exascale computing, computational methods requiring only local data become especially attractive. Consequently, the typical domain decomposition of atmospheric models means that horizontally-explicit, vertically-implicit (HEVI) time-stepping schemes warrant further attention. In this analysis, Runge-Kutta implicit-explicit schemes from the literature are analysed for their stability and accuracy using a von Neumann stability analysis of two linear systems. Attention is paid to the numerical phase to indicate the behaviour of phase and group velocities. Where the analysis is tractable, analytically derived expressions are considered. For more complicated cases, amplification factors have been numerically generated and the associated amplitudes and phases diagnosed. Analysis of a system describing acoustic waves has necessitated attributing the three resultant eigenvalues to the three physical modes of the system. To do so, a series of algorithms has been devised to track the eigenvalues across the frequency space. The result enables analysis of whether the schemes exactly preserve the non-divergent mode; and whether there is evidence of spurious reversal in the direction of group velocities or asymmetry in the damping for the pair of acoustic modes. Frequency ranges that span next-generation high-resolution weather models to coarse-resolution climate models are considered; and a comparison is made of the errors accumulated from multiple stability-constrained shorter time-steps of the HEVI scheme with those from a single integration of a fully implicit scheme over the same time interval. Two schemes, “Trap2(2,3,2)” and “UJ3(1,3,2)”, both already used in atmospheric models, are identified as offering consistently good stability and representation of phase across all the analyses. Furthermore, according to a simple measure of computational cost, “Trap2(2,3,2)” is the least expensive.
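For illustration, a minimal Python sketch of the kind of von Neumann analysis described above is given below. It computes the amplification factor of a generic IMEX Runge-Kutta scheme applied to a split linear test equation; the tableaux shown are for first-order IMEX (forward-backward) Euler, used only as a stand-in, since the tableaux of the schemes analysed in the paper (e.g. "Trap2(2,3,2)") are not reproduced here.

# A minimal sketch (not the paper's code) of a von Neumann-style analysis for an
# IMEX Runge-Kutta scheme applied to the split linear test equation
#   dy/dt = lambda_E * y + lambda_I * y,
# where lambda_E is treated explicitly and lambda_I implicitly.
import numpy as np

def imex_amplification(zE, zI, A_hat, b_hat, A, b):
    """Amplification factor R(zE, zI) of an IMEX-RK scheme.

    Stages satisfy U = 1*y_n + (zE*A_hat + zI*A) U, and the update is
    y_{n+1} = y_n + (zE*b_hat + zI*b) . U, so
    R = 1 + (zE*b_hat + zI*b) @ (I - zE*A_hat - zI*A)^{-1} @ 1.
    """
    s = len(b)
    M = np.eye(s) - zE * A_hat - zI * A
    U = np.linalg.solve(M, np.ones(s))
    return 1.0 + (zE * b_hat + zI * b) @ U

# First-order IMEX Euler tableaux (explicit part, then implicit part),
# a stand-in for the schemes named in the abstract.
A_hat = np.array([[0.0, 0.0], [1.0, 0.0]]);  b_hat = np.array([1.0, 0.0])
A     = np.array([[0.0, 0.0], [0.0, 1.0]]);  b     = np.array([0.0, 1.0])

# Purely oscillatory modes: zE = i*omega_E*dt (slow horizontal dynamics),
# zI = i*omega_I*dt (fast vertical acoustics).  |R| > 1 flags instability,
# while arg(R) gives the numerical phase change per step, from which phase and
# group velocity errors can be diagnosed.
for wE, wI in [(0.1, 1.0), (0.5, 5.0), (1.0, 20.0)]:
    R = imex_amplification(1j * wE, 1j * wI, A_hat, b_hat, A, b)
    print(f"omega_E*dt={wE:4.1f}  omega_I*dt={wI:5.1f}  |R|={abs(R):.3f}  phase={np.angle(R):+.3f}")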
Abstract:
Purpose – Multinationals have always needed an operating model that works – an effective plan for executing their most important activities at the right levels of their organization, whether globally, regionally or locally. The choices involved in these decisions have never been obvious, since international firms have consistently faced trade-offs between tailoring approaches for diverse local markets and leveraging their global scale. This paper seeks a more in-depth understanding of how successful firms manage the global-local trade-off in a multipolar world.
Design/methodology/approach – This paper uses a case study approach based on in-depth senior executive interviews at several telecommunications companies, including Tata Communications. The interviews probed the operating models of the companies studied, focusing on their approaches to organization structure, management processes, management technologies (including information technology (IT)) and people/talent.
Findings – Successful companies balance global-local trade-offs by taking a flexible and tailored approach to their operating-model decisions. The paper finds that successful companies, including Tata Communications, which is profiled in depth, are breaking up the global-local conundrum into a set of more manageable strategic problems – what the authors call “pressure points” – which they identify by assessing their most important activities and capabilities and determining the global and local challenges associated with them. They then design a different operating-model solution for each pressure point, and repeat this process as new strategic developments emerge. By doing so they not only enhance their agility, but also continually calibrate the crucial balance between global efficiency and local responsiveness.
Originality/value – This paper takes a unique approach to operating-model design, finding that an operating model is better viewed as several distinct solutions to specific “pressure points” rather than a single, inflexible model that addresses all challenges equally. Now more than ever, developing the right operating model is at the top of multinational executives' priorities and an area of increasing concern; the international business arena has changed drastically, requiring thoughtfulness and flexibility instead of standard formulas for operating internationally. Old adages like “think global and act local” no longer provide the universal guidance they once seemed to.
Abstract:
The global characteristics of tropical cyclones (TCs) simulated by several climate models are analyzed and compared with observations. The global climate models were forced by the same sea surface temperature (SST) fields in two types of experiments, using climatological SST and interannually varying SST. TC tracks and intensities are derived from each model's output fields by the group that ran that model, using their own preferred tracking scheme; the study considers the combination of model and tracking scheme as a single modeling system, and compares the properties derived from the different systems. Overall, the observed geographic distribution of global TC frequency was reasonably well reproduced. As expected, with the exception of one model, the intensities of the simulated TCs were lower than in observations, to a degree that varies considerably across models.
Abstract:
While a quantitative climate theory of tropical cyclone formation remains elusive, considerable progress has been made recently in our ability to simulate tropical cyclone climatologies and understand the relationship between climate and tropical cyclone formation. Climate models are now able to simulate a realistic rate of global tropical cyclone formation, although simulation of the Atlantic tropical cyclone climatology remains challenging unless horizontal resolutions finer than 50 km are employed. This article summarizes published research from the idealized experiments of the Hurricane Working Group of U.S. CLIVAR (CLImate VARiability and predictability of the ocean-atmosphere system). This work, combined with results from other model simulations, has strengthened relationships between tropical cyclone formation rates and climate variables such as mid-tropospheric vertical velocity, with decreased climatological vertical velocities leading to decreased tropical cyclone formation. Systematic differences are shown between experiments in which only sea surface temperature is increased versus experiments where only atmospheric carbon dioxide is increased, with the carbon dioxide experiments more likely to demonstrate the decrease in tropical cyclone numbers previously shown to be a common response of climate models in a warmer climate. Experiments where the two effects are combined also show decreases in numbers, but these tend to be less for models that demonstrate a strong tropical cyclone response to increased sea surface temperatures. Further experiments are proposed that may improve our understanding of the relationship between climate and tropical cyclone formation, including experiments with two-way interaction between the ocean and the atmosphere and variations in atmospheric aerosols.
Abstract:
Climate change is amplified in the Arctic region. Arctic amplification has been found in past warm [1] and glacial [2] periods, as well as in historical observations [3,4] and climate model experiments [5,6]. Feedback effects associated with temperature, water vapour and clouds have been suggested to contribute to amplified warming in the Arctic, but the surface albedo feedback – the increase in surface absorption of solar radiation when snow and ice retreat – is often cited as the main contributor [7,8,9,10]. However, Arctic amplification is also found in models without changes in snow and ice cover [11,12]. Here we analyse climate model simulations from the Coupled Model Intercomparison Project Phase 5 archive to quantify the contributions of the various feedbacks. We find that in the simulations, the largest contribution to Arctic amplification comes from temperature feedbacks: as the surface warms, more energy is radiated back to space in low latitudes than in the Arctic. This effect can be attributed both to the different vertical structure of the warming in high and low latitudes, and to a smaller increase in emitted blackbody radiation per unit warming at colder temperatures. We find that the surface albedo feedback is the second main contributor to Arctic amplification and that other contributions are substantially smaller or even oppose Arctic amplification.
Abstract:
The term neural population models (NPMs) is used here as a catchall for a wide range of approaches that have been variously called neural mass models, mean field models, neural field models, bulk models, and so forth. All NPMs attempt to describe the collective action of neural assemblies directly. Some NPMs treat the densely populated tissue of cortex as an excitable medium, leading to spatially continuous cortical field theories (CFTs). An indirect approach would start by modelling individual cells and then would explain the collective action of a group of cells by coupling many individual models together. In contrast, NPMs employ collective state variables, typically defined as averages over the group of cells, in order to describe the population activity directly in a single model. The strength and the weakness of this approach are hence one and the same: simplification by bulk. Is this justified and indeed useful, or does it lead to oversimplification which fails to capture the phenomena ...
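As a concrete, deliberately generic illustration of the NPM idea of collective state variables, the following Python sketch integrates a Wilson-Cowan-type pair of equations for the mean activity of an excitatory and an inhibitory population. The parameter values are arbitrary placeholders and are not taken from any model discussed above.

# A minimal sketch of the NPM idea: instead of coupling many single-cell models,
# describe an excitatory and an inhibitory population by collective state
# variables (mean firing rates).  Generic Wilson-Cowan-type system, parameters
# chosen only for illustration.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def wilson_cowan_step(E, I, dt, wEE=12.0, wEI=10.0, wIE=9.0, wII=3.0,
                      tauE=10.0, tauI=20.0, P=1.5, Q=0.0):
    """One Euler step of dE/dt = (-E + sigmoid(wEE*E - wEI*I + P)) / tauE, etc."""
    dE = (-E + sigmoid(wEE * E - wEI * I + P)) / tauE
    dI = (-I + sigmoid(wIE * E - wII * I + Q)) / tauI
    return E + dt * dE, I + dt * dI

E, I = 0.1, 0.05
trace = []
for _ in range(5000):            # 5000 Euler steps with dt = 0.1 time units
    E, I = wilson_cowan_step(E, I, dt=0.1)
    trace.append(E)
print(f"mean excitatory activity over the run: {np.mean(trace):.3f}")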
Abstract:
Second language acquisition researchers often face particular challenges when attempting to generalize study findings to the wider learner population. For example, language learners constitute a heterogeneous group, and it is not always clear how a study’s findings may generalize to other individuals who may differ in terms of language background and proficiency, among many other factors. In this paper, we provide an overview of how mixed-effects models can be used to help overcome these and other issues in the field of second language acquisition. We provide an overview of the benefits of mixed-effects models and a practical example of how mixed-effects analyses can be conducted. Mixed-effects models provide second language researchers with a powerful statistical tool in the analysis of a variety of different types of data.
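A minimal sketch of such a mixed-effects analysis in Python, using statsmodels, is shown below. The data frame and its columns (rt, condition, participant, item) are hypothetical and simulated only to make the example self-contained; a real second language study would typically also include crossed random effects for items, whereas only a by-participant random intercept is shown here for brevity.

# A minimal sketch of a mixed-effects analysis: fixed effect of condition,
# random intercept per participant.  All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_item = 30, 20
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n_subj), n_item),
    "item": np.tile(np.arange(n_item), n_subj),
    "condition": np.tile([0, 1], n_subj * n_item // 2),
})
subj_intercept = rng.normal(0, 50, n_subj)[df["participant"]]
df["rt"] = 700 + 30 * df["condition"] + subj_intercept + rng.normal(0, 80, len(df))

model = smf.mixedlm("rt ~ condition", df, groups=df["participant"])
result = model.fit()
print(result.summary())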
Abstract:
The dynamical processes that lead to open cluster disruption cause the cluster's mass to decrease. To investigate such processes from the observational point of view, it is important to identify open cluster remnants (OCRs), which are intrinsically poorly populated. Due to their nature, distinguishing them from field-star fluctuations is still an unresolved issue. In this work, we developed a statistical diagnostic tool to distinguish poorly populated star concentrations from background field fluctuations. We use 2MASS photometry to explore one of the conditions required for a stellar group to be a physical group: to produce distinct sequences in a colour-magnitude diagram (CMD). We use automated tools to (i) derive the limiting radius; (ii) decontaminate the field and assign membership probabilities; (iii) fit isochrones; and (iv) compare object and field CMDs, considering the isochrone solution, in order to verify their similarity. If the object cannot be statistically considered a field fluctuation, we derive its probable age, distance modulus, reddening and uncertainties in a self-consistent way. As a test, we apply the tool to open clusters and comparison fields. Finally, we study the OCR candidates DoDz 6, NGC 272, ESO 435 SC48 and ESO 325 SC15. The tool is optimized to treat these low-statistics objects and to separate out the best OCR candidates for studies of kinematics and chemical composition. The study of the possible OCRs will certainly provide a deeper understanding of OCR properties and constraints for theoretical models, including insights into the evolution of open clusters and their dissolution rates.
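The decontamination step (ii) can be illustrated with a simple Python sketch, shown below. This is an assumption-laden stand-in for the actual tool: it compares star counts in colour-magnitude cells for the object region and an equal-area comparison field and converts the local overdensity into a membership probability, using random numbers in place of real 2MASS photometry.

# A minimal sketch of CMD-based field decontamination (not the paper's tool).
import numpy as np

def membership_probabilities(cl_colour, cl_mag, fd_colour, fd_mag,
                             colour_bins=10, mag_bins=10):
    """P(member) ~ max(0, (N_object - N_field) / N_object) per CMD cell."""
    c_edges = np.linspace(min(cl_colour.min(), fd_colour.min()),
                          max(cl_colour.max(), fd_colour.max()), colour_bins + 1)
    m_edges = np.linspace(min(cl_mag.min(), fd_mag.min()),
                          max(cl_mag.max(), fd_mag.max()), mag_bins + 1)
    n_cl, _, _ = np.histogram2d(cl_colour, cl_mag, bins=[c_edges, m_edges])
    n_fd, _, _ = np.histogram2d(fd_colour, fd_mag, bins=[c_edges, m_edges])
    with np.errstate(divide="ignore", invalid="ignore"):
        p_cell = np.clip((n_cl - n_fd) / n_cl, 0.0, 1.0)
    p_cell = np.nan_to_num(p_cell)                 # empty cells get probability 0
    ci = np.clip(np.digitize(cl_colour, c_edges) - 1, 0, colour_bins - 1)
    mi = np.clip(np.digitize(cl_mag, m_edges) - 1, 0, mag_bins - 1)
    return p_cell[ci, mi]

# Toy usage with random photometry standing in for 2MASS J and J-H data.
rng = np.random.default_rng(1)
p = membership_probabilities(rng.normal(0.5, 0.2, 200), rng.normal(12, 2, 200),
                             rng.normal(0.6, 0.3, 150), rng.normal(13, 2, 150))
print(f"mean membership probability: {p.mean():.2f}")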
Abstract:
The neuromuscular disorders are a heterogeneous group of genetic diseases, caused by mutations in genes coding for sarcolemmal, sarcomeric, and cytosolic muscle proteins. Deficiencies or loss of function of these proteins lead to variable degrees of progressive loss of motor ability. Several animal models, manifesting phenotypes observed in neuromuscular diseases, have been identified in nature or generated in the laboratory. These models generally present the physiological alterations observed in human patients and can be used as important tools for genetic, clinical, and histopathological studies. The mdx mouse is the most widely used animal model for Duchenne muscular dystrophy (DMD). Although it is a good genetic and biochemical model, presenting total deficiency of the protein dystrophin in the muscle, this mouse is not useful for clinical trials because of its very mild phenotype. The canine golden retriever MD model represents a more clinically similar model of DMD due to its larger size and significant muscle weakness. Models for the autosomal recessive limb-girdle MD forms include the SJL/J mouse, which develops a spontaneous myopathy resulting from a mutation in the dysferlin gene, making it a model for LGMD2B. For the human sarcoglycanopathies (SG), the BIO14.6 hamster is the spontaneous animal model for delta-SG deficiency, whereas some canine models with deficiency of SG proteins have also been identified. More recently, using the homologous recombination technique in embryonic stem cells, several mouse models have been developed with null mutations in each one of the four SG genes. All sarcoglycan-null animals display a progressive muscular dystrophy of variable severity and share the property of a significant secondary reduction in the expression of the other members of the sarcoglycan subcomplex and other components of the dystrophin-glycoprotein complex. Mouse models for congenital MD include the dy/dy (dystrophia muscularis) mouse and the allelic mutant dy(2J)/dy(2J) mouse, both presenting a significant reduction of alpha-2 laminin in the muscle and a severe phenotype. The myodystrophy mouse (Large(myd)) harbors a mutation in the glycosyltransferase Large, which leads to altered glycosylation of alpha-DG, and also presents a severe phenotype. Other informative models for muscle proteins include the knockout mouse for myostatin, which demonstrated that this protein is a negative regulator of muscle growth. Additionally, the stress syndrome in pigs, caused by mutations in the porcine RYR1 gene, helped to localize the gene causing malignant hyperthermia and Central Core myopathy in humans. The study of animal models for genetic diseases, in spite of differences in some phenotypes, can provide important clues to the understanding of the pathogenesis of these disorders and is also very valuable for testing strategies for therapeutic approaches.
Abstract:
In this work we propose and analyze nonlinear elliptical models for longitudinal data, which represent an alternative to Gaussian models in the case of heavy tails, for instance. The elliptical distributions may help to control the influence of the observations on the parameter estimates by naturally attributing different weights to each case. We consider random effects to introduce the within-group correlation and work with the marginal model without requiring numerical integration. An iterative algorithm to obtain maximum likelihood estimates for the parameters is presented, as well as diagnostic results based on residual distances and local influence [Cook, D., 1986. Assessment of local influence. Journal of the Royal Statistical Society, Series B 48 (2), 133-169; Cook, D., 1987. Influence assessment. Journal of Applied Statistics 14 (2), 117-131; Escobar, L.A., Meeker, W.Q., 1992. Assessing influence in regression analysis with censored data. Biometrics 48, 507-528]. As a numerical illustration, we apply the results to a kinetics longitudinal data set presented in [Vonesh, E.F., Carter, R.L., 1992. Mixed-effects nonlinear regression for unbalanced repeated measures. Biometrics 48, 1-17], which was previously analyzed under the assumption of normality.
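To illustrate how an elliptical error distribution naturally attributes weights to individual cases, the following Python sketch fits a plain linear Student-t regression by an EM-type iteratively reweighted algorithm. It is not the paper's nonlinear random-effects model; it only shows the case-weighting mechanism, with an assumed fixed value of the degrees of freedom nu.

# A minimal sketch of the weighting idea behind elliptical (here Student-t)
# models: each case receives a weight that shrinks for outlying observations.
import numpy as np

def student_t_irls(X, y, nu=4.0, n_iter=50):
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    sigma2 = np.var(y - X @ beta)
    for _ in range(n_iter):
        r = y - X @ beta
        w = (nu + 1.0) / (nu + r**2 / sigma2)              # E-step: case weights
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)   # M-step: weighted LS
        sigma2 = np.sum(w * (y - X @ beta) ** 2) / n
    return beta, sigma2, w

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = 1.0 + 2.0 * X[:, 1] + rng.standard_t(3, size=100)      # heavy-tailed noise
beta, sigma2, w = student_t_irls(X, y)
print("estimates:", np.round(beta, 2),
      "smallest weights (most outlying cases):", np.round(np.sort(w)[:3], 2))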
Abstract:
The coexistence between different types of templates has been the solution of choice for the information crisis of prebiotic evolution, triggered by the finding that a single RNA-like template cannot carry enough information to code for any useful replicase. In principle, confining d distinct templates of length L in a package or protocell, whose survival depends on the coexistence of the templates it holds, could resolve this crisis provided that d is made sufficiently large. Here we review the prototypical package model of Niesert et al. [1981. Origin of life between Scylla and Charybdis. J. Mol. Evol. 17, 348-353], which guarantees the greatest possible region of viability of the protocell population, and show that this model, and hence the entire package approach, does not resolve the information crisis. In particular, we show that the total information stored in a viable protocell (Ld) tends to a constant value that depends only on the spontaneous error rate per nucleotide of the template replication mechanism. As a result, an increase of d must be followed by a decrease of L, so that the net information gain is null.
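The scaling claimed for Ld can be made plausible with a back-of-envelope error-threshold argument. This is an illustration, not the paper's derivation; here u denotes the assumed per-nucleotide error rate and sigma an assumed selective advantage of intact protocells. If each nucleotide is copied incorrectly with probability u, the chance that the full complement of templates, Ld nucleotides in total, is copied without error is roughly e^{-uLd}, and the usual threshold condition gives

% illustrative error-threshold bound (assumed parameters u and \sigma)
\[
  \sigma\, e^{-u L d} > 1
  \quad\Longrightarrow\quad
  L d < \frac{\ln \sigma}{u},
\]

so the total information content is capped by a constant set by u, and any increase of d must be compensated by a decrease of L.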
Abstract:
Liponucleosides may assist the anchoring of nucleic acid nitrogen bases in biological membranes for tailored nanobiotechnological applications. To this end, precise knowledge of the biophysical and chemical details at the membrane surface is required. In this paper, we used Langmuir monolayers as simplified cell membrane models and studied the insertion of five lipidated nucleosides. These molecules varied in the type of the covalently attached lipid group, the nucleobase, and the number of hydrophobic moieties attached to the nucleoside. All five lipidated nucleosides were found to be surface-active and capable of forming stable monolayers. They could also be incorporated into dipalmitoylphosphatidylcholine (DPPC) monolayers; four of them induced an expansion in the surface pressure isotherm and a decrease in the surface compression modulus of DPPC. In contrast, one nucleoside possessing three alkyl chain modifications formed very condensed monolayers and induced film condensation and an increase in the compression modulus of the DPPC monolayer, thus reflecting the importance of the ability of the nucleoside molecules to be arranged in a closely packed manner. The implications of these results lie in the possibility of tuning nucleic acid pairing by modifying structural characteristics of the liponucleosides.
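For reference, the surface compression modulus referred to above is conventionally obtained from the pi-A isotherm as

\[
  C_s^{-1} = -A \left( \frac{\partial \pi}{\partial A} \right)_T ,
\]

where A is the mean molecular area and pi the surface pressure, so a more condensed film corresponds to a larger C_s^{-1}.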
Abstract:
Wikipedia is a free, web-based, collaborative, multilingual encyclopedia project supported by the non-profit Wikimedia Foundation. Because Wikipedia is free and open for anyone to edit, the quality of its articles may be affected. Since contributors do not have equal levels of knowledge and hold different opinions about a topic, their contributions may differ as well. To address this, it is important to classify articles so that good-quality articles can be separated from poor-quality ones, which can then be removed from the database. The aim of this study is to classify Wikipedia articles into two classes, class 0 (poor quality) and class 1 (good quality), using an Adaptive Neuro-Fuzzy Inference System (ANFIS) and data mining techniques. Two ANFIS models are built using the Fuzzy Logic Toolbox [1] available in Matlab. The first ANFIS is based on the rules obtained from the J48 classifier in WEKA, while the other was built using expert knowledge. The data used for this research contain 226 article records taken from the German version of Wikipedia. The dataset consists of 19 inputs and one output. The data were preprocessed to remove any similar attributes. The input variables are related to the editors, contributors, length of articles and the lifecycle of articles. Finally, the different methods implemented in this research are compared to analyze the performance of each classification method used.
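A compact stand-in for the classification pipeline is sketched below in Python. ANFIS is not available in common Python libraries, so a decision tree (the closest widely available analogue of WEKA's J48/C4.5, which supplied the rules for the first ANFIS) is trained instead; the 226 records and 19 features are mimicked with random numbers rather than the actual German-Wikipedia data.

# A minimal stand-in sketch of the two-class article-quality task; all data and
# labels are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
X = rng.normal(size=(226, 19))                     # 19 article-level features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=226) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")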
Abstract:
The study aims to assess the empirical adherence of the permanent income theory and the consumption smoothing view in Latin America. Two present value models are considered, one describing household behavior and the other open economy macroeconomics. Following the methodology developed in Campbell and Shiller (1987), bivariate Vector Autoregressions are estimated for the saving ratio and the real growth rate of income in the household behavior model, and for the current account and the change in national cash flow in the open economy model. The countries in the sample are considered separately in the estimation process (individual system estimation) as well as jointly (joint system estimation). Ordinary Least Squares (OLS) and Seemingly Unrelated Regressions (SURE) estimates of the coefficients are generated. Wald tests are then conducted to verify whether the VAR coefficient estimates are in conformity with those predicted by the theory. While the empirical results are sensitive to the estimation method and discount factors used, there is only weak evidence in favor of the permanent income theory and the consumption smoothing view in the group of countries analyzed.
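The econometric procedure can be sketched in Python with statsmodels: fit a bivariate VAR to the two series and test linear restrictions on its coefficients with a Wald test. The series below are synthetic stand-ins for the saving ratio and real income growth, and the restriction tested (a simple non-causality restriction) is only illustrative; the restrictions implied by the present value model depend on the discount factor.

# A minimal sketch of a bivariate VAR plus Wald test on its coefficients.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(4)
T = 200
income_growth = rng.normal(size=T)                          # synthetic series
saving_ratio = 0.5 * np.roll(income_growth, 1) + rng.normal(scale=0.5, size=T)
data = np.column_stack([saving_ratio, income_growth])

res = VAR(data).fit(1)            # VAR(1) in (saving ratio, income growth)
# Wald-type test that income growth does not help predict the saving ratio,
# one simple example of a coefficient restriction.
print(res.test_causality(caused=0, causing=1, kind="wald").summary())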
Abstract:
This article presents a detailed study of the application of different additive manufacturing technologies (sintering, three-dimensional printing, extrusion and stereolithography) in the design process of a complex-geometry model and its moving parts. The fabrication sequence was evaluated in terms of pre-processing conditions (model generation and conversion to the STL and SLI formats), generation strategy and post-processing operations on the physical models. Dimensional verification of the obtained models was undertaken by structured-light projection (optical scanning), a relatively new technology of major importance for metrology and reverse engineering. Manufacturing times and production costs were also studied, which allowed the definition of a more comprehensive evaluation matrix of the additive technologies.