885 results for empirical shell model
Abstract:
Models of the air-sea transfer velocity of gases may be either empirical or mechanistic. Extrapolating an empirical model to an unmeasured gas or to another water temperature can be erroneous if the basis of that extrapolation is flawed. This issue is readily demonstrated for the best-known empirical gas transfer velocity models, in which the influence of bubble-mediated transfer, which can vary between gases, is not explicitly accounted for. Mechanistic models are hindered by incomplete knowledge of the mechanisms of air-sea gas transfer. We describe a hybrid model that incorporates a simple mechanistic view—strictly enforcing the distinction between direct and bubble-mediated transfer—but also uses parameterizations based on data from eddy flux measurements of dimethyl sulphide (DMS) to calibrate the model, together with dual tracer results to evaluate it. This model underpins simple algorithms that can be easily applied within schemes to calculate local, regional, or global air-sea gas fluxes.
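The separation of direct and bubble-mediated transfer described above can be illustrated with a toy calculation. All coefficients, the wind-speed exponents, and the Schmidt-number scaling below are illustrative assumptions, not the paper's DMS-calibrated parameterization:

```python
import math

def transfer_velocity(u10, sc, sol, a=0.2, b=0.1, n=0.5):
    """Toy hybrid gas transfer velocity (arbitrary units).

    u10 : 10-m wind speed (m/s)
    sc  : Schmidt number of the gas
    sol : dimensionless (Ostwald) solubility
    a, b, n : illustrative tuning constants, not calibrated values.
    """
    # Direct (interfacial) transfer scales with wind speed and is
    # normalized to Sc = 660 (CO2 in seawater at 20 C).
    k_direct = a * u10**2 * (sc / 660.0) ** (-n)
    # Bubble-mediated transfer is suppressed for soluble gases, which
    # is why it must be treated separately for each gas.
    k_bubble = b * u10**3 / (1.0 + sol)
    return k_direct + k_bubble

# A soluble, DMS-like gas receives less bubble enhancement than an
# insoluble, CO2-like gas at the same wind speed.
k_soluble = transfer_velocity(10.0, sc=1000.0, sol=10.0)
k_insoluble = transfer_velocity(10.0, sc=660.0, sol=0.03)
```

The point of the sketch is structural: because the bubble term depends on solubility, extrapolating a single fitted curve from one gas to another silently mixes the two pathways.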
Abstract:
In establishing the reliability of performance-related design methods for concrete – which are relevant for resistance against chloride-induced corrosion – long-term experience of local materials and practices and detailed knowledge of the ambient and local micro-climate are critical. Furthermore, in the development of analytical models for performance-based design, calibration against test data representative of actual conditions in practice is required. To this end, the current study presents results from full-scale concrete pier-stems under long-term exposure to a marine environment, with work focussing on XS2 (below mid-tide level), in which the concrete is regarded as fully saturated, and XS3 (tidal, splash and spray), in which the concrete is in an unsaturated condition. These exposures represent the zones where concrete structures are most susceptible to ionic ingress and deterioration. Chloride profiles and chloride transport behaviour are studied using both an empirical model (erfc function) and a physical model (ClinConc). The time dependency of surface chloride concentration (Cs) and apparent diffusivity (Da) was established for the empirical model, whereas in the ClinConc model (originally based on saturated concrete) two new environmental factors were introduced for the XS3 exposure zone. Although XS3 is considered a single environmental exposure zone according to BS EN 206-1:2013, the work has highlighted that even within this zone significant differences in chloride ingress are evident. This study aims to update the parameters of both models for predicting the long-term transport behaviour of concrete subjected to environmental exposure classes XS2 and XS3.
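The empirical model referred to above is the standard error-function solution of Fick's second law. A minimal sketch, with placeholder values for Cs and Da rather than the study's fitted parameters:

```python
import math

def chloride_profile(x_mm, t_years, cs, da_mm2_per_year, ci=0.0):
    """Chloride content at depth x after exposure time t, from the
    erfc solution of Fick's second law:
        C(x, t) = Ci + (Cs - Ci) * erfc(x / (2 * sqrt(Da * t)))
    cs : surface chloride concentration, ci : initial content.
    """
    return ci + (cs - ci) * math.erfc(
        x_mm / (2.0 * math.sqrt(da_mm2_per_year * t_years))
    )

# Illustrative values only: Cs = 0.5 % by mass of concrete,
# Da = 30 mm^2/year, 10 years of marine exposure.
c_surface = chloride_profile(0.0, 10.0, cs=0.5, da_mm2_per_year=30.0)
c_deep = chloride_profile(50.0, 10.0, cs=0.5, da_mm2_per_year=30.0)
```

In practice both Cs and Da are taken as time-dependent, which is exactly the refinement the study establishes for the XS2 and XS3 zones.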
Abstract:
We present the first 3D simulation of the last minutes of oxygen shell burning in an 18 solar mass supernova progenitor up to the onset of core collapse. A moving inner boundary is used to accurately model the contraction of the silicon and iron core according to a 1D stellar evolution model with a self-consistent treatment of core deleptonization and nuclear quasi-equilibrium. The simulation covers the full solid angle to allow the emergence of large-scale convective modes. Due to core contraction and the concomitant acceleration of nuclear burning, the convective Mach number increases to ~0.1 at collapse, and an l=2 mode emerges shortly before the end of the simulation. Aside from a growth of the oxygen shell from 0.51 to 0.56 solar masses due to entrainment from the carbon shell, the convective flow is reasonably well described by mixing length theory, and the dominant scales are compatible with estimates from linear stability analysis. We deduce that artificial changes in the physics, such as accelerated core contraction, can have precarious consequences for the state of convection at collapse. We argue that scaling laws for the convective velocities and eddy sizes furnish good estimates for the state of shell convection at collapse and develop a simple analytic theory for the impact of convective seed perturbations on shock revival in the ensuing supernova. We predict a reduction of the critical luminosity for explosion by 12--24% due to seed asphericities for our 3D progenitor model relative to the case without large seed perturbations.
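The mixing-length scaling invoked above can be illustrated with a back-of-the-envelope estimate of the convective Mach number. The shell luminosity, radius, density, and sound speed below are round illustrative numbers, not values taken from the simulation:

```python
import math

def convective_mach(l_conv, r, rho, c_sound):
    """Mixing-length-style estimate of the convective Mach number:
        v_conv ~ (L_conv / (4 * pi * r^2 * rho))^(1/3)
        Mach   = v_conv / c_sound
    All quantities in CGS units.
    """
    v_conv = (l_conv / (4.0 * math.pi * r**2 * rho)) ** (1.0 / 3.0)
    return v_conv / c_sound

# Illustrative oxygen-shell numbers (CGS): convective luminosity
# ~1e45 erg/s, radius ~5e8 cm, density ~1e6 g/cm^3, sound speed
# ~1e8 cm/s; this lands near the Mach ~0.1 regime quoted above.
mach = convective_mach(1e45, 5e8, 1e6, 1e8)
```

The cube-root dependence is why the Mach number rises only gradually even as nuclear burning accelerates sharply during the final contraction.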
Abstract:
Today a number of studies are published on how organizational strategy is developed and how organizations contribute to local and regional development through the realization of these strategies. There are also many articles dealing with project success, identifying the success criteria and the factors that influence them. This article introduces the project-oriented strategic planning process, which reveals how projects contribute to local and regional development, and demonstrates the relationship between this approach and the regional competitiveness model as well as the KRAFT concept. Much research focuses on sustainability in business, arguing that sustainability is very important to the future success of a business. However, the Project Excellence Model that analyses project success does not contain sustainability criteria, and the GPM P5 standard contains sustainability components related to the organizational level. To fill this gap, a Project Sustainability Excellence Model (PSEM) was developed. The model was tested through interviews with managers of Hungarian for-profit and non-profit organizations. This paper introduces the PSEM and highlights the most important elements of the empirical analysis.
Abstract:
Aim The aim of this study is to explore, based on internationally recognised frameworks: 1. how internal control structures are applied in Sweden across different sectors; 2. how organizational size and environment affect internal control structures; and 3. the impact of internal control structures on organizational performance. Methods A quantitative method was used in the data collection and analysis. The sample consisted of 1117 organizations operating in Sweden. A mean analysis was conducted to measure the level of internal control structures across different industries, organizational sizes, and different choices of listing in the stock exchange market. Pearson's correlation analysis was then used to explore possible correlations between external environmental factors and internal control structures, and between internal control structures and organizational performance. Lastly, a structural model was built to measure the impact of internal control structures on organizational performance. The measurements of internal control structures and organizational performance are based on the COSO framework's principles and objectives. Results This study gives insight into how internal control structures are applied across industrial sectors in Sweden, with financial institutions and manufacturing organizations having notably higher levels of internal control structures. Additionally, it provides evidence of the impact external environmental factors have on internal control structures. Furthermore, it shows that organizations listed on the Swedish stock exchange market have a level of internal control structures equivalent to those registered on the American stock exchange market. In contrast, organisations that are not listed on a stock exchange have a notably lower level of internal control structures. Lastly, it illustrates the positive impact the presence of internal control structures has on organizational performance.
Conclusion The results highlight the crucial role the supervisory authority Finansinspektionen (FI) has in regulating the Swedish financial market. They also show that the stability of the Swedish business environment has had a positive impact on the level of internal control structures.
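The correlation step in the methods above is a standard Pearson product-moment calculation. A minimal sketch, using made-up score vectors rather than the study's survey data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up illustration: an internal-control score against an
# organizational-performance score for five organizations.
control = [3.1, 3.8, 2.5, 4.2, 3.6]
performance = [2.9, 3.5, 2.2, 4.4, 3.3]
r = pearson_r(control, performance)
```

A value of r near +1 would be consistent with the positive association the study reports, though the structural model, not the raw correlation, carries the causal claim.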
Abstract:
Conventional classroom teaching practices often struggle to keep students motivated and engaged. Video games, however, are very successful at sustaining high levels of motivation and engagement through a set of tasks for hours without apparent loss of focus. In addition, gamers solve complex problems within a gaming environment without feeling the fatigue or frustration they would typically experience with a comparable learning task. Based on this notion, the academic community is keen on exploring methods that can deliver deep learner engagement and has shown increased interest in adopting gamification – the integration of gaming elements, mechanics, and frameworks into non-game situations and scenarios – as a means to increase student engagement and improve information retention. Its effectiveness when applied to education has been debatable, though, as attempts have generally been restricted to one-dimensional approaches such as transposing a trivial reward system onto existing teaching materials and/or assessments. Nevertheless, a gamified, multi-dimensional, problem-based learning approach can yield improved results even when applied to a very complex and traditionally dry task like the teaching of computer programming, as shown in this paper. The presented quasi-experimental study used a combination of instructor feedback, a real-time sequence of scored quizzes, and live coding to deliver a fully interactive learning experience. More specifically, the "Kahoot!" Classroom Response System (CRS), the classroom version of the TV game show "Who Wants To Be A Millionaire?", and Codecademy's interactive platform formed the basis for a learning model which was applied to an entry-level Python programming course. Students were thus able to experience multiple interlocking methods similar to those commonly found in a top-quality game experience.
To assess gamification's impact on learning, empirical data from the gamified group were compared to those from a control group that was taught through a traditional learning approach, similar to the one used with previous cohorts. Despite this being a relatively small-scale study, the results and findings for a number of key metrics, including attendance, downloading of course material, and final grades, were encouraging and indicated that the gamified approach was motivating and enriching for both students and instructors.
Abstract:
Business Process Management (BPM) can organize and frame a company, focusing on the improvement or assurance of performance in order to gain competitive advantage. Although it is believed that BPM improves various aspects of organizational performance, empirical evidence for this has been lacking. The present study aims to develop a model showing the impact of business process management on organizational performance. To accomplish that, the theoretical basis required to identify the elements that constitute BPM, and the measures that can evaluate BPM success on organizational performance, is built through a systematic literature review (SLR). Then, a research model is proposed according to the SLR results. Empirical data will be collected from a survey of large and mid-sized industrial and service companies headquartered in Brazil. A quantitative analysis will be performed using structural equation modeling (SEM) to show whether the direct effects between BPM and organizational performance can be considered statistically significant. Finally, these results and their managerial and scientific implications are discussed. Keywords: Business process management (BPM). Organizational performance. Firm performance. Business models. Structural Equation Modeling. Systematic Literature Review.
Abstract:
This thesis studies, in collaboration with a Finnish logistics service company, gainsharing and the development of gainsharing models in a logistics outsourcing context. The purpose of the study is to create various gainsharing model variations for the use of a service provider and its customers in order to develop and enhance the customer's processes and operations, create savings, and improve collaboration between the companies. The study concentrates on offering gainsharing model alternatives for companies operating in an internal logistics outsourcing context. Additionally, the prerequisites for a gainsharing arrangement are introduced. At the beginning of the study an extensive literature review is conducted, exploring three main themes: collaboration in an outsourcing context, key account management, and the gainsharing philosophy. Customer expectations and experiences are gathered by interviewing the case company's employees and its key customers. The knowledge and experience of customers and other experts are utilized in designing the gainsharing model prototypes. The result of this thesis is five gainsharing model variations based on the empirical and theoretical data. In addition, instructions related to each created model are given to the case company, but are not included in this paper.
Abstract:
Hydroxyl radical (OH) is the primary oxidant in the troposphere, initiating the removal of numerous atmospheric species including greenhouse gases, pollutants that are detrimental to human health, and ozone-depleting substances. Because of the complexity of OH chemistry, models vary widely in their OH chemistry schemes and resulting methane (CH4) lifetimes. The current state of knowledge concerning global OH abundances is often contradictory. This body of work encompasses three projects that investigate tropospheric OH from a modeling perspective, with the goal of improving the atmospheric chemistry community's knowledge of the atmospheric lifetime of CH4. First, measurements taken during the airborne CONvective TRansport of Active Species in the Tropics (CONTRAST) field campaign are used to evaluate OH in global models. A box model constrained to measured variables is used to infer concentrations of OH along the flight track. The results are used to evaluate global model performance, argue against the existence of a proposed "OH Hole" in the tropical Western Pacific, and investigate the implications of high-O3/low-H2O filaments for chemical transport to the stratosphere. While methyl chloroform-based estimates of global mean OH suggest that models are overestimating OH, we report evidence that these models are actually underestimating OH in the tropical Western Pacific. The second project examines OH within global models to diagnose differences in CH4 lifetime. I developed an approach to quantify the roles of OH precursor field differences (O3, H2O, CO, NOx, etc.) using a neural network method. This technique enables us to approximate the change in CH4 lifetime resulting from variations in individual precursor fields. The dominant factors driving CH4 lifetime differences between models are O3, CO, and J(O3→O(1D)). My third project evaluates the effect of climate change on global fields of OH using an empirical model.
Observations of H2O and O3 from satellite instruments are combined with a simulation of tropical expansion to derive changes in global mean OH over the past 25 years. We find that increasing H2O and increasing width of the tropics tend to increase global mean OH, countering the increasing CH4 sink and resulting in well-buffered global tropospheric OH concentrations.
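The tight coupling between global mean OH and the CH4 lifetime discussed above follows from simple first-order loss. A sketch using the standard JPL-recommended Arrhenius rate for the CH4 + OH reaction; the global-mean OH value and effective temperature are illustrative round numbers, not results from this work:

```python
import math

def ch4_lifetime_years(oh_conc, temp_k=272.0):
    """CH4 lifetime against OH oxidation: tau = 1 / (k(T) * [OH]).

    Uses the JPL-recommended Arrhenius expression for CH4 + OH:
        k(T) = 2.45e-12 * exp(-1775 / T)  [cm^3 molecule^-1 s^-1]
    oh_conc : global-mean OH in molecules/cm^3 (illustrative here),
    temp_k  : effective mean temperature for the loss (illustrative).
    """
    k = 2.45e-12 * math.exp(-1775.0 / temp_k)
    tau_seconds = 1.0 / (k * oh_conc)
    return tau_seconds / (365.25 * 24 * 3600)

# With the commonly quoted global-mean [OH] of ~1e6 molecules/cm^3,
# the lifetime against OH comes out at roughly a decade.
tau = ch4_lifetime_years(1.0e6)
```

The inverse dependence on [OH] is why inter-model spreads of a few tens of percent in OH translate directly into comparable spreads in the diagnosed CH4 lifetime.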
Abstract:
The financial crisis of 2007-2008 led to extraordinary government intervention in firms and markets. The scope and depth of government action rivaled that of the Great Depression. Many traded markets experienced dramatic declines in liquidity leading to the existence of conditions normally assumed to be promptly removed via the actions of profit seeking arbitrageurs. These extreme events motivate the three essays in this work. The first essay seeks and fails to find evidence of investor behavior consistent with the broad 'Too Big To Fail' policies enacted during the crisis by government agents. Only in limited circumstances, where government guarantees such as deposit insurance or U.S. Treasury lending lines already existed, did investors impart a premium to the debt security prices of firms under stress. The second essay introduces the Inflation Indexed Swap Basis (IIS Basis) in examining the large differences between cash and derivative markets based upon future U.S. inflation as measured by the Consumer Price Index (CPI). It reports the consistent positive value of this measure as well as the very large positive values it reached in the fourth quarter of 2008 after Lehman Brothers went bankrupt. It concludes that the IIS Basis continues to exist due to limitations in market liquidity and hedging alternatives. The third essay explores the methodology of performing debt based event studies utilizing credit default swaps (CDS). It provides practical implementation advice to researchers to address limited source data and/or small target firm sample size.
Abstract:
The Neolithic was marked by a transition from small and relatively egalitarian groups to much larger groups with increased stratification, but the dynamics of this transition remain poorly understood. It is hard to see how despotism can arise without coercion, yet coercion could not easily have occurred in an egalitarian setting. Using a quantitative model of evolution in a patch-structured population, we demonstrate that the interaction between demographic and ecological factors can overcome this conundrum. We model the co-evolution of individual preferences for hierarchy alongside the degree of despotism of leaders and the dispersal preferences of followers. We show that voluntary leadership without coercion can evolve in small groups when leaders help to solve coordination problems related to resource production; an example is coordinating construction of an irrigation system. Our model predicts that the transition to larger despotic groups will then occur when: 1. surplus resources lead to demographic expansion of groups, removing the viability of an acephalous niche in the same area and so locking individuals into hierarchy; 2. high dispersal costs limit followers' ability to escape a despot. Empirical evidence suggests that these conditions were likely met for the first time during the subsistence intensification of the Neolithic.
Abstract:
We describe a one-step bio-refinery process for shrimp composite by-products. Its originality lies in a simple, rapid (6 h) biotechnological cuticle fragmentation process that recovers all major compounds (chitins, peptides, and minerals, in particular calcium). The process consists of a controlled exogenous enzymatic proteolysis in a food-grade acidic medium, allowing chitin purification (solid phase) and recovery of peptides and minerals (liquid phase). At a pH between 3.5 and 4, protease activity is effective and peptides are preserved. Solid-phase demineralization kinetics were followed for phosphoric, hydrochloric, acetic, formic, and citric acids, with pKa values ranging from 2.1 to 4.76. Formic acid met the initial aim of (i) a 99 % demineralization yield and (ii) a 95 % deproteinization yield at a pH close to 3.5 and a molar ratio of 1.5. The proposed one-step process is shown to be efficient. To provide the elements necessary for future optimization of the process, two models to predict shell demineralization kinetics were studied: one based on simplified physical considerations and a second, empirical one. The first model did not accurately describe the kinetics for times exceeding 30 minutes, whereas the empirical one performed adequately.
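An empirical kinetic model of the kind mentioned above can be sketched, at its simplest, as first-order demineralization. The functional form and the rate constant below are placeholders for illustration, not the model or the fitted value from the paper:

```python
import math

def demineralization_yield(t_min, k_per_min):
    """Simple first-order empirical model:
        X(t) = 1 - exp(-k * t)
    where X is the fraction of minerals removed after t minutes."""
    return 1.0 - math.exp(-k_per_min * t_min)

# Placeholder rate constant, chosen only so the yield approaches the
# ~99 % target within the 6 h (360 min) window quoted in the abstract.
k = 0.013
x_30min = demineralization_yield(30.0, k)
x_360min = demineralization_yield(360.0, k)
```

Fitting k to measured yield-versus-time data (e.g. by least squares on log(1 - X)) is the typical calibration step for a model of this shape.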
Abstract:
A self-organising model of macadamia, expressed using L-systems, was used to explore aspects of canopy management. A small set of parameters controls the basic architecture of the model, with a high degree of self-organisation determining the fate and growth of buds. Light was sensed at the leaf level, used to represent vigour, and accumulated basipetally. Buds also sensed light so as to provide demand in the subsequent redistribution of vigour. Empirical relationships were derived from a set of 24 completely digitised trees after conversion to multiscale tree graphs (MTG) and analysis with the OpenAlea software library. The ability to write MTG files was embedded within the model so that various tree statistics could be exported for each run. To explore the parameter space, a series of runs was completed using a high-throughput computing platform; combined with MTG generation and analysis in OpenAlea, this provided a convenient way to explore thousands of simulations. We allowed the model trees to develop using self-organisation and simulated cultural practices such as hedging, topping, removal of the leader, and limb removal within a small representation of an orchard. By coupling the model with a path-tracing program to simulate the light environment, it provides insight into the impact of these practices on the potential for growth and on light distribution within the canopy and to the orchard floor. The lessons learnt will be applied to other evergreen, tropical fruit and nut trees.
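At its core, the L-system formalism used above is parallel string rewriting. A minimal sketch using a textbook two-symbol system (Lindenmayer's algae example), not the macadamia model's actual productions:

```python
def lsystem(axiom, rules, steps):
    """Apply parallel rewriting rules to an axiom for a number of
    steps. Symbols without a matching rule are copied unchanged."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Textbook example: 'A' is an apical bud that grows and produces a
# lateral bud 'B'; 'B' matures into an 'A'. These are illustrative
# rules, not the macadamia model's productions.
rules = {"A": "AB", "B": "A"}
out = lsystem("A", rules, 5)
```

Architectural models extend this scheme with parameters on each symbol (internode length, bud vigour) and with environmental queries, which is how the light-sensing and vigour redistribution described above plug into the rewriting step.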
Abstract:
This research explores the business model (BM) evolution process of entrepreneurial companies and investigates the relationship between BM evolution and firm performance. Recently, it has been increasingly recognised that the innovative design (and re-design) of BMs is crucial to the performance of entrepreneurial firms, as BMs can be associated with superior value creation and competitive advantage. However, there has been limited theoretical and empirical evidence on the micro-mechanisms behind the BM evolution process and the entrepreneurial outcomes of BM evolution. This research seeks to fill this gap by opening up the ‘black box’ of the BM evolution process, exploring the micro-patterns that facilitate the continuous shaping, changing, and renewing of BMs, and examining how BM evolution creates and captures value in a dynamic manner. Drawing together the BM and strategic entrepreneurship literatures, this research seeks to understand: (1) how and why companies introduce BM innovations and imitations; (2) how BM innovations and imitations interplay as patterns in the BM evolution process; and (3) how BM evolution patterns affect firm performance. This research adopts a longitudinal multiple case study design that focuses on the emerging phenomenon of BM evolution. Twelve entrepreneurial firms in the Chinese Online Group Buying (OGB) industry were selected for their continuous and intensive development of BMs and their varying success rates in this highly competitive market. Two rounds of data collection were carried out between 2013 and 2014, generating 31 interviews with founders/co-founders and 5,034 pages of data in total. Following a three-stage research framework, the data analysis began by mapping the BM evolution process of the twelve companies and classifying the changes in the BMs into innovations and imitations.
The second stage focuses on the BM level, addressing BM evolution as a dynamic process by exploring how BM innovations and imitations unfold and interplay over time. The final stage focuses on the firm level, providing theoretical explanations of the effects of BM evolution patterns on firm performance. This research provides new insights into the nature of BM evolution by elaborating on the missing link between BM dynamics and firm performance. The findings identify four patterns of BM evolution that have different effects on a firm's short- and long-term performance. This research contributes to the BM literature by presenting what the BM evolution process actually looks like. Moreover, it takes a step towards a process theory of the interplay between BM innovations and imitations, which addresses the role of companies' actions and, more importantly, their reactions to competitors. Insights are also given into how entrepreneurial companies achieve and sustain value creation and capture by successfully combining BM evolution patterns. Finally, the findings on BM evolution contribute to the strategic entrepreneurship literature by increasing the understanding of how companies compete in a more dynamic and complex environment. It reveals that the achievement of superior firm performance is more than a simple question of whether to innovate or imitate; rather, it is an integration of innovation and imitation strategies over time. This study concludes with a discussion of the findings and their implications for theory and practice.
Abstract:
The main aim of this study was to determine the impact of innovation on productivity in service sector companies — especially those in the hospitality sector — that value the reduction of environmental impact as relevant to the innovation process. We used a structural analysis model based on the one developed by Crépon, Duguet, and Mairesse (1998), known as the CDM model (an acronym of the authors’ surnames). These authors developed seminal studies in the field of the relationship between innovation and productivity (see Griliches 1979; Pakes and Griliches 1980). The main advantage of the CDM model is its ability to integrate the process of innovation and business productivity from an empirical perspective.