965 results for unified growth models
Abstract:
The IEEE 802.11 standard has achieved huge success in the past decade and is still under development to provide higher physical data rates and better quality of service (QoS). An important problem for the development and optimization of IEEE 802.11 networks is the modeling of the MAC-layer channel access protocol. Although there are already many theoretical analyses of the 802.11 MAC protocol in the literature, most models focus on saturated traffic and assume an infinite buffer at the MAC layer. In this paper we develop a unified analytical model for the IEEE 802.11 MAC protocol in ad hoc networks. The impacts of channel access parameters, traffic rate and buffer size at the MAC layer are modeled with the assistance of a generalized Markov chain and an M/G/1/K queue model. Throughput, packet delivery delay and dropping probability are derived from the model. Extensive simulations show the analytical model is highly accurate. The analytical model shows that for practical buffer configurations (e.g. buffer size larger than one), we can maximize the total throughput and reduce the packet blocking probability (due to limited buffer size) and the average queuing delay to zero by effectively controlling the offered load. The average MAC-layer service delay, as well as its standard deviation, is also much lower than in saturated conditions and has an upper bound. It is also observed that the optimal load is very close to the maximum achievable throughput regardless of the number of stations or buffer size. Moreover, the model is scalable for performance analysis of 802.11e in unsaturated conditions and of 802.11 ad hoc networks with heterogeneous traffic flows. © 2012 KSI.
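The abstract gives no formulas, but the effect of a finite MAC buffer is easy to illustrate with the simpler M/M/1/K queue (an assumption; the paper couples an M/G/1/K queue to a Markov chain), whose blocking probability has a closed form:

```python
# Minimal sketch, assuming an M/M/1/K simplification of the paper's
# M/G/1/K buffer model; rho is offered load over service rate.
def mm1k_blocking(rho: float, K: int) -> float:
    """Probability that an arriving packet finds the K-slot buffer full."""
    if abs(rho - 1.0) < 1e-12:          # degenerate case rho == 1
        return 1.0 / (K + 1)
    return (1.0 - rho) * rho**K / (1.0 - rho**(K + 1))

for K in (1, 5, 20):                    # illustrative buffer sizes
    print(K, round(mm1k_blocking(0.8, K), 4))
# Blocking falls quickly with buffer size, consistent with the abstract's
# point that practical buffers (K > 1) allow near-zero blocking when the
# offered load is controlled.
```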
Abstract:
This article uses a semiparametric smooth coefficient model (SPSCM) to estimate TFP growth and its components (scale and technical change). The SPSCM is derived from a nonparametric specification of the production technology represented by an input distance function (IDF), using a growth formulation. The functional coefficients of the SPSCM come naturally from the model and are fully flexible, in the sense that no functional form of the underlying production technology is used to derive them. Another advantage of the SPSCM is that it can estimate bias (input and scale) in technical change in a fully flexible manner. We also use a translog IDF framework to estimate the TFP growth components. A panel of U.S. electricity-generating plants for the period 1986–1998 is used for this purpose. Comparing estimated TFP growth from both the parametric and semiparametric models against Divisia TFP growth, we conclude that the SPSCM performs best in tracking the temporal behavior of TFP growth.
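The abstract does not reproduce the estimator. As a rough sketch of how a smooth coefficient model can be fitted, here is kernel-weighted local least squares for y_i = x_i'beta(z_i) + e_i, a textbook approach rather than the authors' exact SPSCM:

```python
import numpy as np

def spscm_beta(z0, X, y, z, h):
    """Local least squares estimate of beta(z0) in y_i = x_i' beta(z_i) + e_i;
    a minimal smooth-coefficient sketch, not the authors' exact estimator."""
    w = np.exp(-0.5 * ((z - z0) / h) ** 2)        # Gaussian kernel weights
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)    # (X'WX)^-1 X'Wy

rng = np.random.default_rng(0)
n = 400
z = rng.uniform(0, 1, n)                          # smoothing variable
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.column_stack([np.sin(2 * np.pi * z), 1 + z])
y = (X * beta_true).sum(axis=1) + 0.1 * rng.normal(size=n)
print(spscm_beta(0.5, X, y, z, h=0.1))            # ~[0.0, 1.5]
```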
Abstract:
Many foreign investment operations in emerging markets are small and likely to have only a limited impact on the local economy. However, host governments often expect transfer of advanced technology from multinational enterprises (MNEs) operating in these markets to local firms by way of inter-firm mobility of skilled labour. The extent of such transfers is limited, among other factors, by the size of the pool of skilled workers who can potentially move between MNEs and local firms, which in turn is determined by employment growth at the MNEs. We develop an empirical specification that models this employment growth, drawing on both the economics and international business literatures. The model is then estimated using firm-level data from four emerging markets. We find that wholly owned foreign direct investment operations have higher employment growth, while local industry and institutional characteristics moderate the growth effect. This suggests that policies encouraging foreign investors to set up in the form of joint ventures may not actually raise the benefits for the host economy.
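As a hedged illustration of the kind of firm-level specification described (all variable names and magnitudes below are invented; the abstract does not state the exact model or controls), a within-transformation fixed-effects regression might look like:

```python
import numpy as np

# Hypothetical sketch: employment growth on an ownership dummy, with
# country-industry fixed effects removed by demeaning within cells.
rng = np.random.default_rng(1)
n, n_cells = 1000, 40
cell = rng.integers(0, n_cells, n)                 # country-industry cell id
wholly_owned = rng.integers(0, 2, n).astype(float)
cell_fe = rng.normal(size=n_cells)[cell]
growth = 0.04 * wholly_owned + cell_fe + 0.05 * rng.normal(size=n)

def demean(v, g):
    means = np.bincount(g, weights=v) / np.bincount(g)
    return v - means[g]

y_t, x_t = demean(growth, cell), demean(wholly_owned, cell)
beta = (x_t @ y_t) / (x_t @ x_t)                   # within estimator
print(beta)                                        # recovers ~0.04
```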
Abstract:
Areolae of the crustose lichen Rhizocarpon geographicum (L.) DC. are present on the peripheral prothallus (marginal areolae) and also aggregate to form confluent masses in the centre of the thallus (central areolae). To determine the relationships between these areolae, and whether growth of the peripheral prothallus depends on the marginal areolae, the density, morphology, and size-frequency distributions of marginal areolae were measured in 23 thalli of R. geographicum in north Wales, UK using image analysis (ImageJ). The size and morphology of central areolae were also studied across the thallus. Marginal areolae were small, punctate, and occurred in clusters scattered over the peripheral prothallus, while central areolae were larger and had a lobed structure. The size-class frequency distributions of the marginal and central areolae were fitted by power-law and log-normal models, respectively. In 16 of 23 thalli, central areolae close to the outer edge were larger and had a more complex lobed morphology than those towards the thallus centre. Neither mean width nor radial growth rate (RaGR) of the peripheral prothallus was correlated with the density, diameter, or area fraction of marginal areolae. The data suggest central areolae may develop from marginal areolae as follows: (1) marginal areolae develop in clusters at the periphery and fuse to form central areolae, (2) central areolae grow exponentially, and (3) crowding of central areolae results in constriction and fragmentation. In addition, growth of the peripheral prothallus may be unrelated to the marginal areolae. © 2013 Springer Science+Business Media Dordrecht.
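For readers who want to reproduce the two distribution fits on their own size data, maximum likelihood is the standard route; the data below are synthetic stand-ins for areola areas, and the estimators are textbook methods rather than the authors' exact procedure:

```python
import numpy as np
from scipy import stats

def powerlaw_alpha(sizes, xmin):
    """ML exponent of a continuous power law (Hill-type estimator)."""
    x = sizes[sizes >= xmin]
    return 1.0 + x.size / np.sum(np.log(x / xmin))

rng = np.random.default_rng(2)
# Synthetic stand-ins: "marginal" power-law-like, "central" log-normal-like.
marginal = stats.pareto.rvs(b=1.5, scale=0.01, size=2000, random_state=rng)
central = stats.lognorm.rvs(s=0.8, scale=0.5, size=2000, random_state=rng)

print("power-law alpha:", round(powerlaw_alpha(marginal, xmin=0.01), 2))  # ~2.5
shape, loc, scale = stats.lognorm.fit(central, floc=0)
print("log-normal sigma, median:", round(shape, 2), round(scale, 2))      # ~0.8, ~0.5
```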
Abstract:
Practitioners assess the performance of entities in increasingly large and complicated datasets. If non-parametric models such as Data Envelopment Analysis were ever considered simple push-button technologies, this is impossible when many variables are available or when data have to be compiled from several sources. This paper introduces the 'COOPER-framework', a comprehensive model for carrying out non-parametric projects. The framework consists of six interrelated phases: Concepts and objectives, On structuring data, Operational models, Performance comparison model, Evaluation, and Result and deployment. Each phase describes the steps a researcher should work through for a well-defined and repeatable analysis. The COOPER-framework gives the novice analyst guidance, structure and advice for a sound non-parametric analysis, while the more experienced analyst benefits from a checklist that ensures important issues are not forgotten. In addition, the use of a standardized framework makes non-parametric assessments more reliable, more repeatable, more manageable, faster and less costly. © 2010 Elsevier B.V. All rights reserved.
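The COOPER-framework is a process model rather than an algorithm, but its operational-models phase typically ends in a DEA run. A minimal input-oriented CCR efficiency computation (a textbook formulation, not part of the paper) can be set up as a linear program:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR DEA efficiency of unit j0.
    X: (n, m) inputs, Y: (n, s) outputs; variables are [theta, lambdas]."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                       # minimise theta
    A_in = np.hstack([-X[j0].reshape(m, 1), X.T])    # sum lam*x <= theta*x0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])      # sum lam*y >= y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[j0]])
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

X = np.array([[2., 3.], [4., 2.], [3., 5.]])         # toy inputs
Y = np.array([[1.], [1.], [1.]])                     # one unit of output each
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])  # ~[1.0, 1.0, 0.667]
```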
Abstract:
This paper investigates empirically the importance of technological catch-up in explaining productivity growth in a sample of countries since the 1960s. New proxies for a country's absorptive capability—based on data for students studying abroad, telecommunications and publications—are tested in regression models. The results indicate that absorptive capability is a factor in explaining growth, with the most robust finding that countries with relatively high numbers of students studying science or engineering abroad experience faster subsequent growth. However, the paper also indicates that the significance of coefficients varies across specifications and samples, suggesting caution in focusing on individual results.
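A hedged sketch of a catch-up specification in this spirit (proxies, names and coefficients are invented for illustration; the abstract reports only the qualitative finding):

```python
import numpy as np

# growth = b0 + b1*gap + b2*(gap * students_abroad) + e, where gap is the
# distance to the productivity frontier; a positive b2 is the "absorptive
# capability" effect. All data below are synthetic.
rng = np.random.default_rng(3)
n = 90                                           # "countries"
gap = rng.uniform(0, 2, n)                       # log distance to frontier
students = rng.uniform(0, 1, n)                  # science students abroad proxy
growth = 0.01 + 0.005 * gap + 0.02 * gap * students + 0.01 * rng.normal(size=n)

X = np.column_stack([np.ones(n), gap, gap * students])
b, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(b)                                         # recovers ~[0.01, 0.005, 0.02]
```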
Abstract:
Preeclampsia is a hypertensive disorder of pregnancy caused by abnormal placental function, partly because of chronic hypoxia at the utero-placental junction. The increase in levels of soluble vascular endothelial growth factor receptor 1, an antiangiogenic agent known to inhibit placental vascularization, is an important cellular factor implicated in the onset of preeclampsia. We investigated the ligand urotensin II (U-II), a potent endogenous vasoconstrictor and proangiogenic agent, whose levels have been reported to increase in patients with preeclampsia. We hypothesized that an increased sensitivity to U-II in preeclampsia might be achieved by upregulation of placental U-II receptors. We further investigated the role of U-II receptor stimulation on soluble vascular endothelial growth factor receptor 1 release in placental explants from diseased and normal patients. Immunohistochemistry, real-time PCR, and Western blotting analysis revealed that U-II receptor expression was significantly upregulated in preeclamptic placentas compared with controls (P<0.01). Cellular models of syncytiotrophoblast and vascular endothelial cells subjected to hypoxic conditions revealed an increase in U-II receptor levels in the syncytiotrophoblast model. This induction is regulated by the transcriptional activator hypoxia-inducible factor 1α. U-II treatment was associated with increased secretion of soluble vascular endothelial growth factor receptor 1 in preeclamptic placental explants under hypoxia, but not under control conditions. Interestingly, normal placental explants did not respond to U-II stimulation.
Abstract:
Understanding the process of economic growth has been called the ultimate objective of economics. It has also been likened to an elusive quest, like the Holy Grail or the Elixir of Life (Easterly 2001). Taking on such a quest requires ingenuity and perseverance. Even small insights along the way can bring major benefits to millions of people; small mistakes can do the reverse. Economies that achieve large increases in output over extended periods not only enable rapid increases in standards of living but also undergo dramatic changes in the economic, political and social landscape. For example, the USA is estimated to have produced approximately 30 times as much in 1999 as it did in 1899. This sustained economic growth means that in 1999 the USA had an average income per capita of US$34 100. In contrast, sub-Saharan Africa had an average income of $490. Understanding these vast income differences, produced over many decades, is the elusive quest. The aim of this survey is to explain how economists try to understand the process of economic growth. To make the task manageable, the focus is on major issues and current debates. Models and conceptual frameworks are discussed in section III. Section IV summarises empirical studies, with a particular focus on econometric studies of groups of countries. This is not to say that case studies of single countries are not valuable, but space precludes covering everything. The following section sets out some facts about economic growth and, hopefully, motivates the further effort needed to tackle the theory and econometrics.
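The 30-fold rise in US output between 1899 and 1999 quoted above implies an average compound growth rate of roughly 3.5% per year:

```python
# Average annual growth implied by a 30-fold increase over 100 years.
g = 30 ** (1 / 100) - 1
print(f"{g:.2%} per year")   # ~3.46%
```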
Abstract:
For the facilities of a large-scale gas-transport company (GTC), a comprehensive unified evolutionary approach is suggested, covering the basic design concepts, up-to-date technologies, models, methods and tools used in the design, adoption, maintenance and development phases of multilevel automated distributed control systems (ADCS). As a single methodological basis for the suggested approach, three basic concepts were worked out, containing the core methodological principles and conceptual provisions for the creation of distributed control systems at each level: the lower level (automated control of technological processes based on up-to-date SCADA systems), the middle level (operative-dispatch production control based on MES systems) and the upper level (business-process control based on integrated ERP systems).
Abstract:
Analysis of risk measures associated with price-series movements and their prediction is of strategic importance in the financial markets, as well as to policy makers, in particular for short- and long-term planning and for setting economic growth targets. For example, oil-price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument to measure risk and is evaluated by analysing the negative/positive tail of the probability distribution of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, freedom from large outliers, and satisfaction of the Gauss-Markov assumptions. However, in practical situations the LSE-based linear regression models often fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow fat-tailed distributions, even if the error terms possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus, applications of the LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1-, L2- and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
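As a concrete sketch of the two ingredients discussed: historical VaR is an empirical quantile of the return distribution, and the L1-norm (least-absolute-deviations) regression can be solved exactly as a linear program. All data below are synthetic and the formulation is standard, not the paper's exact procedure:

```python
import numpy as np
from scipy.optimize import linprog

def l1_regression(X, y):
    """Least-absolute-deviations (L1-norm) fit via linear programming."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])   # min sum(u+ + u-)
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])        # X b + u+ - u- = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

rng = np.random.default_rng(4)
returns = 0.02 * rng.standard_t(df=3, size=1000)   # fat-tailed toy returns
print("95% VaR:", round(-np.quantile(returns, 0.05), 4))

X = np.column_stack([np.ones(300), rng.normal(size=300)])
y = X @ np.array([0.1, 0.5]) + 0.05 * rng.standard_t(df=3, size=300)
print("L1 fit:", l1_regression(X, y))              # ~[0.1, 0.5]
```

An L∞-norm (Chebyshev) fit is the analogous LP that minimises the single largest residual instead of their sum.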
Abstract:
Colon and pancreatic cancers contribute to 90,000 deaths each year in the USA. These cancers lack targeted therapeutics owing to the heterogeneity of the disease and multiple causative factors. One important factor that contributes to increased colon and pancreatic cancer risk is gastrin. Gastrin mediates its actions through two G-protein-coupled receptors (GPCRs): cholecystokinin receptor A (CCK-A) and the CCK-B/gastrin receptor. Previous studies have indicated that colon cancer predominantly expresses CCK-A and responds to CCK-A isoform antagonists. However, many CCK-A antagonists have failed in the clinic owing to poor pharmacokinetic properties or lack of efficacy. In the present study, we synthesized a library of CCK-A isoform-selective antagonists and tested them in various colon and pancreatic cancer preclinical models. The lead CCK-A isoform-selective antagonist, PNB-028, bound CCK-A at 12 nM with 60-fold selectivity for CCK-A over CCK-B. Furthermore, it inhibited the proliferation of CCK-A-expressing colon and pancreatic cancer cells without affecting the proliferation of non-cancerous cells. PNB-028 was also extremely effective in inhibiting the growth of MAC-16 and LoVo colon cancer and MIA PaCa pancreatic cancer xenografts in immune-compromised mice. Genome-wide microarray and kinase-array studies indicate that PNB-028 inhibited oncogenic kinases and angiogenic factors to inhibit the growth of colon cancer xenografts. Safety pharmacology and toxicology studies indicate that PNB-028 is extremely safe and has a wide safety margin. These studies suggest that selectively targeting CCK-A holds promise for treating colon and pancreatic cancers and that PNB-028 could become a next-generation treatment option.
Abstract:
Climate change has a great impact on the structure and functioning of natural ecosystems. Even a small change in temperature can cause the disappearance of a population or growth in the numbers of some species. A Theoretical Ecosystem Growth Model was investigated in order to examine the effects of various climate patterns on the ecological equilibrium. The responses of ecosystems to climate change can be described by means of global climate modelling and dynamic vegetation models. Examining the operation of such ecosystems is usually possible only on supercomputers in large computing centres, because of the number and complexity of the calculations. By modelling a theoretical ecosystem in terms of temperature and reproduction alone, the computational load can be reduced to the level of a PC, and several important theoretical questions can be answered.
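The abstract does not give the model's equations; the toy sketch below (all functional forms and parameters are assumptions) only illustrates the qualitative mechanism studied, a temperature-dependent reproduction rate that lets a small warming flip a population from persistence to extinction:

```python
import numpy as np

def r_of_T(T, T_opt=20.0, width=3.0, r_max=0.8):
    """Net growth rate: Gaussian thermal tolerance minus fixed mortality."""
    return r_max * np.exp(-((T - T_opt) / width) ** 2) - 0.2

def simulate(T, n0=10.0, K=1000.0, years=200):
    """Discrete-time logistic growth at constant temperature T."""
    n = n0
    for _ in range(years):
        n = max(n + r_of_T(T) * n * (1 - n / K), 0.0)
    return n

for T in (20, 23, 26):
    print(T, round(simulate(T), 1))
# At T = 26 the net rate turns negative and the population dies out,
# while at 20 and 23 it persists near carrying capacity.
```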
Abstract:
Ecological models have often been used to answer questions at the forefront of recent research, such as the possible effects of climate change. The methodology of tactical models is a very useful tool compared with complex models that require a relatively large set of input parameters. In this study, a theoretical strategic model (TEGM) was adapted to field data on the basis of a 24-year monitoring database of phytoplankton in the Danube River at the station of Göd, Hungary (at river kilometre 1669, hereafter 'rkm'). The Danubian Phytoplankton Growth Model (DPGM) is able to describe the seasonal dynamics of phytoplankton biomass (mg L−1) based on daily temperature, while also taking the availability of light into consideration. In order to improve the fit, the 24-year database was split into two parts according to environmental sustainability: the period 1979–1990 had a higher level of nutrient excess than 1991–2002. The authors assume that phytoplankton responded to temperature in two different ways in these periods, so two submodels were developed, DPGM-sA and DPGM-sB. Observed and simulated data correlated quite well. The findings suggest that a linear temperature rise brings drastic change to phytoplankton only in the case of high nutrient load, mostly realized through an increase in yearly total biomass.
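The published functions are likewise not reproduced in the abstract; the sketch below only illustrates the general shape of a temperature- and light-driven daily biomass update of the DPGM kind (every number in it is invented):

```python
import numpy as np

days = np.arange(365)
T = 12 + 10 * np.sin(2 * np.pi * (days - 100) / 365)       # daily temp, C
light = 0.5 + 0.5 * np.sin(2 * np.pi * (days - 80) / 365)  # relative light

B = np.empty(365)
B[0] = 0.1                                    # starting biomass, mg L^-1
for t in range(364):
    growth = 0.15 * max(T[t] - 4, 0) / 16 * light[t]  # T- and light-limited
    loss = 0.08                               # grazing + washout (assumed)
    B[t + 1] = max(B[t] * (1 + growth - loss), 0.01)

print("peak biomass (mg L^-1):", round(B.max(), 2), "on day", int(B.argmax()))
```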
Abstract:
The development of a new set of frost-property measurement techniques for use in the control of frost growth and defrosting processes in refrigeration systems was investigated. Holographic interferometry and infrared thermometry were used to measure the temperature of the frost-air interface, while a beam-element load sensor was used to obtain the weight of the deposited frost layer. The proposed measurement techniques were tested for the cases of natural and forced convection, and characteristic charts were obtained for a set of operational conditions. An improvement of existing mathematical models of frost growth was also investigated. The early stage of frost nucleation is commonly not considered in these models; instead, initial values of layer thickness and porosity are usually assumed. A nucleation model was developed to obtain the droplet diameter and surface porosity at the end of the early frosting period. Drop-wise early condensation on a cold flat plate under natural convection from warm (room temperature) humid air was modeled. A nucleation rate was found, and the relation of heat to mass transfer (the Lewis number) was obtained. The Lewis number was found to be much smaller than unity, the standard value usually assumed in most frosting numerical models. The nucleation model was validated against available experimental data for the early nucleation and full-growth stages of the frosting process. The combination of frost top temperature and weight variation signals can now be used to control defrosting timing, and the developed early nucleation model can now be used to simulate the entire process of frost growth on any surface material.
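For context on the Lewis number result, the textbook definition Le = alpha/D and the Chilton-Colburn analogy linking heat and mass transfer coefficients can be evaluated directly (the property values below are rough approximations, and the heat transfer coefficient is assumed; the dissertation's point is that nucleation-stage values deviate from these):

```python
# Lewis number for water vapour in air and the Chilton-Colburn analogy.
alpha = 2.2e-5        # thermal diffusivity of air, m^2/s (approximate)
D_wv = 2.3e-5         # water vapour diffusivity in air, m^2/s (approximate)
Le = alpha / D_wv
print("Le =", round(Le, 2))                    # ~1 for bulk air

h = 25.0              # convective heat transfer coeff., W/(m^2 K) (assumed)
rho, cp = 1.29, 1006.0                         # air density, specific heat
h_m = h / (rho * cp * Le ** (2 / 3))           # mass transfer coeff., m/s
print("h_m =", round(h_m, 5))
```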
A framework for transforming, analyzing, and realizing software designs in Unified Modeling Language
Abstract:
Unified Modeling Language (UML) is the most comprehensive and widely accepted object-oriented modeling language, owing to its multi-paradigm modeling capabilities, easy-to-use graphical notations, strong international organizational support, and production-quality industrial tool support. However, there is no precise definition of the semantics of individual UML notations or of the relationships among multiple UML models, which often introduces incompleteness and inconsistency problems in UML software designs, especially for complex systems. Furthermore, there is a lack of methodologies to ensure a correct implementation from a given UML design. The purpose of this investigation is to verify and validate software designs in UML, and to provide dependability assurance for the realization of a UML design. In my research, an approach is proposed to transform UML diagrams into a semantic domain, which is a formal component-based framework. The proposed framework consists of components and of interactions through message passing, modeled by two-layer algebraic high-level nets and transformation rules, respectively. In the transformation approach, class diagrams, state machine diagrams and activity diagrams are transformed into component models, and transformation rules are extracted from interaction diagrams. By applying transformation rules to component models, a (sub)system model of one or more scenarios can be constructed. Various techniques, such as model checking and Petri net analysis, can be adopted to check whether UML designs are complete or consistent. A new component called the property parser was developed and merged into the tool SAM Parser, which realizes (sub)system models automatically. The property parser generates and weaves runtime monitoring code into system implementations automatically for dependability assurance. The framework is flexible: it can be used not only to verify and validate UML designs but also to build models for various scenarios. As a result of my research, several kinds of previously ignored behavioral inconsistencies can be detected.
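The two-layer algebraic high-level nets used in the dissertation are not spelled out in the abstract; the sketch below implements only the elementary place/transition token game that such component models build on (the net structure is invented for illustration):

```python
from typing import Dict, Tuple

Marking = Dict[str, int]
# transition -> (tokens consumed per place, tokens produced per place)
NET: Dict[str, Tuple[Marking, Marking]] = {
    "send":    ({"ready": 1}, {"in_transit": 1}),
    "receive": ({"in_transit": 1}, {"ready": 1, "delivered": 1}),
}

def enabled(m: Marking, t: str) -> bool:
    """A transition is enabled when every input place holds enough tokens."""
    pre, _ = NET[t]
    return all(m.get(p, 0) >= k for p, k in pre.items())

def fire(m: Marking, t: str) -> Marking:
    """Fire t: remove tokens from input places, add them to output places."""
    pre, post = NET[t]
    m = dict(m)
    for p, k in pre.items():
        m[p] -= k
    for p, k in post.items():
        m[p] = m.get(p, 0) + k
    return m

m = {"ready": 1}
for t in ("send", "receive"):
    assert enabled(m, t)
    m = fire(m, t)
print(m)    # {'ready': 1, 'in_transit': 0, 'delivered': 1}
```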