926 results for Business Value Two-Layer Model
Abstract:
It is well known that cointegration between the levels of two variables (e.g. prices and dividends) is a necessary condition for assessing the empirical validity of a present-value model (PVM) linking them. The work on cointegration, namely on long-run co-movements, has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. This amounts to investigating whether short-run co-movements stemming from common cyclical feature restrictions are also present in such a system. In this paper we test for the presence of such co-movement on long- and short-term interest rates and on prices and dividends for the U.S. economy. We focus on the potential improvement in forecasting accuracy when imposing these two types of restrictions coming from economic theory.
Abstract:
This paper makes two original contributions. First, we show that the present value model (PVM hereafter), which has wide application in macroeconomics and finance, entails common cyclical feature restrictions in the dynamics of the vector error-correction representation (Vahid and Engle, 1993); something that has already been investigated in the VECM context by Johansen and Swensen (1999, 2011) but has not been discussed before with this new emphasis. We also provide the present-value reduced-rank constraints to be tested within the log-linear model. Our second contribution relates to forecasting time series that are subject to those long- and short-run reduced-rank restrictions. The reason why appropriate common cyclical feature restrictions might improve forecasting is that they provide natural exclusion restrictions, preventing the estimation of useless parameters that would otherwise increase forecast variance with no expected reduction in bias. We applied the techniques discussed in this paper to data known to be subject to present-value restrictions, i.e. the online series maintained and updated by Shiller. We focus on three different data sets. The first includes the levels of interest rates with long and short maturities, the second includes the levels of real price and dividend for the S&P composite index, and the third includes the logarithmic transformation of prices and dividends. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when restrictions are applied to them. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for target variables of PVMs and 63.33% of the time when all variables in the system are considered.
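Both abstracts above hinge on testing two kinds of restrictions: cointegration of the levels (long-run co-movement) and common-cycle restrictions requiring the model's forecast error to be orthogonal to the past (short-run co-movement). The minimal Python sketch below illustrates the flavour of such tests on simulated price/dividend data; it is an assumption-laden illustration using statsmodels, not the authors' procedure or data.

# Sketch only: simulated data, illustrative lag choices, not the papers' method.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
n = 400
dividends = np.cumsum(rng.normal(size=n))          # simulated I(1) dividend series
prices = 25.0 * dividends + rng.normal(size=n)     # prices tied to dividends (toy PVM)
data = np.column_stack([prices, dividends])

# (1) Johansen test for the cointegration rank of the levels
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1)
print("5% critical values:", jres.cvt[:, 1])

# (2) crude common-feature style check: regress a proxy for the model's forecast
# error on lagged short-run dynamics; under the PVM restrictions the lagged terms
# should be jointly insignificant (orthogonality to the past).
spread = prices - 25.0 * dividends
dy = np.diff(data, axis=0)
y = np.diff(spread)[1:]
X = sm.add_constant(dy[:-1])
ols = sm.OLS(y, X).fit()
print(ols.f_test(np.eye(X.shape[1])[1:]))          # joint test on the lagged terms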
Abstract:
The following paper was conducted with the support of several entrepreneurs and startups from Brazil. The aim of the research was to find out what impact the Business Model Canvas (hereafter BMC) has on technology-oriented startups in Brazil. The first step of the study was to identify some general concepts of entrepreneurship, as well as the conditions and environment of the country. Afterwards, the focus was on defining different business model tools and concepts and comparing them to the BMC. After the literature review and meetings with several professionals in the area of entrepreneurship and startups, a questionnaire was formulated in order to conduct the qualitative study and identify the main impact of the tool. The questionnaire was answered by ten startups. In order to check the validity and credibility of the research outcomes, theory and investigator triangulation was used. As a result, the usage of the BMC could be evaluated by comparing the outcomes with the theory, which showed that Brazilian tech startups are using Osterwalder's model for idea creation and for testing, validating and pivoting their business model. Interestingly, the research revealed that entrepreneurs often use the tool not in the traditional way of printing it, but rather as a thinking approach. Besides, the entrepreneurs focus mostly on developing a strong Value Proposition, Customer Segment and sustainable Revenue Streams, and only afterwards build the remaining building blocks. Moreover, the research showed that the startups also use other concepts, such as the Customer Development Process or the Build-Measure-Learn Feedback Loop. These methodologies are often applied together with the BMC and help to identify the most sustainable components of the business idea. Keywords: Business
Abstract:
When there is a failure in the external sheath of a flexible pipe, a high hydrostatic pressure is transferred to its internal plastic layer and consequently to its interlocked carcass, leading to the possibility of collapse. The design of a flexible pipe must predict the maximum external pressure the carcass layer can be subjected to without collapse. This value depends on the initial ovalization due to manufacturing tolerances. To study this problem, two numerical finite element models were developed to simulate the behavior of the carcass subjected to external pressure, including the plastic behavior of the materials. The first is a full 3D model and the second is a 3D ring model, both composed of solid elements. An interesting conclusion is that both models provide the same results. An analytical model using an equivalent thickness approach for the carcass layer was also constructed. A good correlation between the analytical and numerical models was achieved for pre-collapse behavior, but the collapse pressure value and post-collapse behavior were not well predicted by the analytical model. [DOI: 10.1115/1.4005185]
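For context, the interplay between elastic buckling, plastic capacity, and initial ovalization described above is often captured in design practice by a characteristic collapse-pressure equation of the kind used in offshore pipeline codes (e.g., DNV). The sketch below solves such an equation numerically; the formula and all parameter values are assumptions for illustration and are not the paper's equivalent-thickness carcass model.

# Sketch: collapse pressure of a thin-walled pipe with initial ovality, in the style
# of offshore design codes (an assumption; not the paper's carcass model):
#   (pc - pel) * (pc**2 - pp**2) = pc * pel * pp * f0 * D / t
#   pel = 2*E*(t/D)**3 / (1 - nu**2)   elastic buckling pressure
#   pp  = 2*fy*t/D                     plastic capacity pressure
#   f0  = (Dmax - Dmin)/D              initial ovality
import numpy as np

E, nu, fy = 207e9, 0.3, 600e6   # steel properties (Pa), illustrative values
D, t = 0.20, 0.006              # equivalent diameter and thickness (m), illustrative
f0 = 0.01                       # 1% initial ovalization

pel = 2.0 * E * (t / D) ** 3 / (1.0 - nu ** 2)
pp = 2.0 * fy * t / D

# cubic in pc: pc^3 - pel*pc^2 - (pp^2 + pel*pp*f0*D/t)*pc + pel*pp^2 = 0
coeffs = [1.0, -pel, -(pp ** 2 + pel * pp * f0 * D / t), pel * pp ** 2]
roots = np.roots(coeffs)
pc = min(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0.0)
print(f"elastic: {pel/1e6:.1f} MPa, plastic: {pp/1e6:.1f} MPa, collapse: {pc/1e6:.1f} MPa")

The smallest positive root is the governing collapse pressure; larger ovality f0 pushes it further below both the elastic and plastic limits, which is the sensitivity the abstract refers to.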
Abstract:
Effects of roads on wildlife and its habitat have been measured using metrics such as the nearest road distance, road density, and effective mesh size. In this work we introduce two new indices: (1) the Integral Road Effect (IRE), which measures the summed effect of all points of a road at a fixed point in the forest; and (2) the Average Value of the Infinitesimal Road Effect (AVIRE), which measures the average effect of the roads at this point. IRE is formally defined as the line integral of a special function (the infinitesimal road effect) along the curves that model the roads, whereas AVIRE is the quotient of IRE by the length of the roads. Combining tools of the ArcGIS software with a numerical algorithm, we calculated these and other road and habitat cover indices at a sample of points in a human-modified landscape in the Brazilian Atlantic Forest, where data on the abundance of two groups of small mammals (forest specialists and habitat generalists) were collected in the field. We then compared, through the Akaike Information Criterion (AIC), a set of candidate regression models to explain the variation in small mammal abundance, including models with our two new road indices (AVIRE and IRE), models with other road effect indices (nearest road distance, mesh size, and road density), and reference models (containing only habitat indices, or only the intercept without the effect of any variable). Compared to the other road effect indices, AVIRE showed the best performance in explaining the abundance of forest specialist species, whereas the nearest road distance performed best for generalist species. AVIRE and habitat together were included in the best model for both small mammal groups, that is, higher abundance of specialist and generalist small mammals occurred where there is a lower average road effect (less AVIRE) and more habitat. Moreover, unlike the other road effect indices (except mesh size), AVIRE was not significantly correlated with the habitat cover of specialists and generalists, which allows the effect of roads to be separated from the effect of habitat on small mammal communities. We suggest that the proposed indices and GIS procedures could also be useful to describe other spatial ecological phenomena, such as edge effects in habitat fragments. (C) 2012 Elsevier B.V. All rights reserved.
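The two definitions above (IRE as a line integral of an infinitesimal road-effect function along the road curves, AVIRE as IRE divided by the total road length) can be sketched numerically as follows. The decay function, its parameter, and the coordinates are made-up assumptions; the actual indices were computed with ArcGIS tools as described in the abstract.

# Minimal numerical sketch of IRE and AVIRE for a polyline road (illustrative only).
import numpy as np

def road_effect(dist, alpha=0.01):
    # hypothetical infinitesimal road effect: decays exponentially with distance
    return np.exp(-alpha * dist)

def ire_avire(point, road_vertices, alpha=0.01):
    # approximate the line integral of road_effect along the road (midpoint rule)
    p = np.asarray(point, dtype=float)
    v = np.asarray(road_vertices, dtype=float)
    ire, length = 0.0, 0.0
    for a, b in zip(v[:-1], v[1:]):
        ts = np.linspace(0.0, 1.0, 51)
        pts = a + ts[:, None] * (b - a)
        mids = 0.5 * (pts[:-1] + pts[1:])
        ds = np.linalg.norm(np.diff(pts, axis=0), axis=1)
        dists = np.linalg.norm(mids - p, axis=1)
        ire += np.sum(road_effect(dists, alpha) * ds)
        length += np.linalg.norm(b - a)
    return ire, ire / length

ire, avire = ire_avire(point=(50.0, 120.0),
                       road_vertices=[(0.0, 0.0), (100.0, 0.0), (200.0, 50.0)])
print(f"IRE = {ire:.3f}, AVIRE = {avire:.3f}")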
Abstract:
In this work we have investigated the intercalation of electron donors between NbS2 slabs in Nb-based layered sulfides. Two series of Sr-substituted Nb-based misfit sulfides belonging to the 1.5Q/1H and 1Q/1H series of misfit layer compounds have been synthesised. For large lanthanides (Ln = La, Ce), only the 1Q/1H compounds formed, whereas for smaller lanthanides and yttrium, both types of phases can be obtained. The crystal structure of the misfit sulfide (Pr0.55Sr0.45S)1.15NbS2 has been refined using the composite approach. In the Q-slab, Pr atoms are partly replaced by Sr with a random distribution over one cation position. The crystal structure of the misfit sulfide [(Sm1/3Sr2/3S)1.5]1.15NbS2, belonging to the 1.5Q/1H series, has also been determined. The obtained results suggest a preferred occupancy of the cation positions in the slab, where Sr atoms mainly occupy positions on the exterior of the slab while Sm atoms are in the center of the slab. The (La1-xSrxS)1.15NbS2 solid solution (0.1<x<0.9) has also been studied. It was found that the maximum value of Sr substitution is 40-50% and, therefore, the minimal charge transfer needed to stabilize this structure type is about 0.6 e− per Nb atom. An attempt to synthesize SrxNbS2 (0.1≤x≤0.5) intercalates was made, but single phases were not obtained, and increasing the temperature from 1000 °C to 1100 °C leads to the decomposition of these intercalates. Single crystals of Sr0.22Nb1.05S2 and Sr0.23NbS2 were found and their structures were determined. The structures belong to two different types of packing with a statistical distribution of Sr between the layers. A new superconducting sulfide, "EuNb2S5", was investigated by ED and HREM, and a structure model consisting of Nb7S14 and (Eu3S4)2 slabs alternating along the c-axis is suggested. An attempt was made to propose a model for the structure of "SrNb2S5" by means of single-crystal X-ray diffraction. The proposed structure consists of two types of slabs: a Nb7S14 slab and a [Sr6(NbS4)2S] slab with niobium in tetrahedral coordination. It is shown that "SrNb2S5" and "EuNb2S5" have similar structures. For the first time, single crystals of the complex sulfide BaNb0.9S3 have also been studied by means of single-crystal X-ray diffraction. The single-crystal refinement and EDX analysis showed the existence of cation vacancies at the niobium position. BaNb0.9S3 has also been studied by ED and no superstructure was found, which implies that the vacancies are statistically distributed. No improvement of the magnetic properties of the studied compounds was observed in comparison to NbS2.
Abstract:
In electronic commerce, systems development is based on two fundamental types of models: business models and process models. A business model is concerned with value exchanges among business partners, while a process model focuses on operational and procedural aspects of business communication. Thus, a business model defines the what in an e-commerce system, while a process model defines the how. Business process design can be facilitated and improved by a method for systematically moving from a business model to a process model. Such a method would provide support for traceability, evaluation of design alternatives, and a seamless transition from analysis to realization. This work proposes a unified framework that can be used as a basis to analyze, interpret and understand the different concepts associated with different stages in e-commerce system development. In this thesis, we illustrate how UN/CEFACT's recommended metamodels for business and process design can be analyzed, extended and then integrated into final solutions based on the proposed unified framework. Also, as an application of the framework, we demonstrate how process-modeling tasks can be facilitated in e-commerce system design. The proposed methodology, called BP3 (Business Process Patterns Perspective), uses a question-answer interface to capture different business requirements from the designers. It is based on pre-defined process patterns, and the final solution is generated by applying the captured business requirements, by means of a set of production rules, to complete the inter-process communication among these patterns.
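As a rough illustration of the question-answer capture and production-rule based pattern selection that BP3 describes, the toy sketch below maps captured requirements to pre-defined process patterns. The questions, rules, and pattern names are hypothetical; the thesis' actual metamodels and rule set are not reproduced here.

# Toy sketch: answers from a question-answer interface drive production rules that
# select process patterns (all names and rules are invented for illustration).
answers = {
    "payment_before_delivery": True,
    "third_party_logistics": False,
    "recurring_billing": True,
}

# production rules: condition over the captured answers -> pattern to instantiate
rules = [
    (lambda a: a["payment_before_delivery"], "Prepayment pattern"),
    (lambda a: not a["payment_before_delivery"], "Invoice-after-delivery pattern"),
    (lambda a: a["third_party_logistics"], "Outsourced-fulfilment pattern"),
    (lambda a: a["recurring_billing"], "Subscription-billing pattern"),
]

selected = [pattern for condition, pattern in rules if condition(answers)]
print("patterns to wire together:", selected)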
Abstract:
The ability of the one-dimensional lake model FLake to represent the mixolimnion temperatures for tropical conditions was tested for three locations in East Africa: Lake Kivu and Lake Tanganyika's northern and southern basins. Meteorological observations from surrounding automatic weather stations were corrected and used to drive FLake, whereas a comprehensive set of water temperature profiles served to evaluate the model at each site. Careful forcing data correction and model configuration made it possible to reproduce the observed mixed layer seasonality at Lake Kivu and Lake Tanganyika (northern and southern basins), with correct representation of both the mixed layer depth and water temperatures. At Lake Kivu, mixolimnion temperatures predicted by FLake were found to be sensitive both to minimal variations in the external parameters and to small changes in the meteorological driving data, in particular wind velocity. In each case, small modifications may lead to a regime switch, from the correctly represented seasonal mixed layer deepening to either completely mixed or permanently stratified conditions from ~10 m downwards. In contrast, model temperatures were found to be robust close to the surface, with acceptable predictions of near-surface water temperatures even when the seasonal mixing regime is not reproduced. FLake can thus be a suitable tool to parameterise tropical lake water surface temperatures within atmospheric prediction models. Finally, FLake was used to attribute the seasonal mixing cycle at Lake Kivu to variations in the near-surface meteorological conditions. It was found that the annual mixing down to 60 m during the main dry season is primarily due to enhanced lake evaporation and secondarily to the decreased incoming longwave radiation, both causing a significant heat loss from the lake surface and associated mixolimnion cooling.
Abstract:
A basin-wide interdecadal change in both the physical state and the ecology of the North Pacific occurred near the end of 1976. Here we use a physical-ecosystem model to examine whether changes in the physical environment associated with the 1976-1977 transition influenced the lower trophic levels of the food web and, if so, by what means. The physical component is an ocean general circulation model, while the biological component contains 10 compartments: two phytoplankton, two zooplankton, two detritus pools, nitrate, ammonium, silicate, and carbon dioxide. The model is forced with observed atmospheric fields during 1960-1999. During spring, there is a ~40% reduction in plankton biomass in all four plankton groups during 1977-1988 relative to 1970-1976 in the central Gulf of Alaska (GOA). The epoch difference in plankton appears to be controlled by the mixed layer depth. Enhanced Ekman pumping after 1976 caused the halocline to shoal, and thus the mixed layer, which extends to the top of the halocline in late winter, did not penetrate as deep in the central GOA. As a result, more phytoplankton remained in the euphotic zone, and phytoplankton biomass began to increase earlier in the year after the 1976 transition. Zooplankton biomass also increased, but then grazing pressure led to a strong decrease in phytoplankton by April, followed by a drop in zooplankton by May; essentially, the mean seasonal cycle of plankton biomass was shifted earlier in the year. As the seasonal cycle progressed, the difference in plankton concentrations between epochs reversed sign again, leading to slightly greater zooplankton biomass during summer in the later epoch.
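The mechanism described above (a shallower mixed layer letting phytoplankton bloom earlier, followed by grazing-driven declines) can be caricatured with a very small nutrient-phytoplankton-zooplankton system. The sketch below is deliberately simplified: the actual model has 10 compartments and an ocean GCM, whereas this toy version keeps only three state variables and uses made-up parameters.

# Toy NPZ sketch (not the study's 10-compartment model): a prescribed mixed layer
# depth modulates light-limited phytoplankton growth over one model year.
import numpy as np
from scipy.integrate import solve_ivp

def npz(t, y, mld):
    N, P, Z = y
    light = 30.0 / mld(t)                       # crude proxy: shallower mixed layer -> more light
    uptake = 1.0 * light * N / (0.5 + N) * P    # nutrient- and light-limited growth
    grazing = 0.6 * P / (0.3 + P) * Z           # zooplankton grazing on phytoplankton
    return [-uptake + 0.2 * grazing,            # nutrient: consumed, partly recycled
            uptake - grazing - 0.05 * P,        # phytoplankton
            0.7 * grazing - 0.1 * Z]            # zooplankton

# mixed layer depth (m) shoaling around day 120, a caricature of the post-1976 state
mld = lambda t: 80.0 - 50.0 * np.exp(-((t - 120.0) / 60.0) ** 2)
sol = solve_ivp(npz, (0.0, 365.0), [5.0, 0.1, 0.05], args=(mld,), max_step=1.0)
print("peak phytoplankton day:", sol.t[np.argmax(sol.y[1])])

Shifting the shoaling earlier in the year moves the simulated bloom (and the subsequent zooplankton peak) earlier, which is the qualitative seasonal-cycle shift the abstract reports.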
Impact of epinephrine and norepinephrine on two dynamic indices in a porcine hemorrhagic shock model
Abstract:
BACKGROUND: Pulse pressure variations (PPVs) and stroke volume variations (SVVs) are dynamic indices for predicting fluid responsiveness in intensive care unit patients. These hemodynamic markers rest on the Frank-Starling law, by which volume expansion increases cardiac output (CO). The aim of the present study was to evaluate the impact of the administration of catecholamines on PPV, SVV, and inferior vena cava flow (IVCF). METHODS: In this prospective, physiologic, animal study, hemodynamic parameters were measured in deeply sedated and mechanically ventilated pigs. Systemic hemodynamics and pressure-volume loops obtained by inferior vena cava occlusion were recorded. Measurements were collected under two conditions, normovolemia and hypovolemia, the latter generated by blood removal to obtain a mean arterial pressure lower than 60 mm Hg. Under each condition, CO, IVCF, SVV, and PPV were assessed by catheters and flow meters. Data were compared between the normovolemic and hypovolemic conditions, before and after intravenous administration of norepinephrine and epinephrine, using a nonparametric Wilcoxon test. RESULTS: Eight pigs were anesthetized, mechanically ventilated, and instrumented. Both norepinephrine and epinephrine significantly increased IVCF and decreased PPV and SVV, regardless of volemic conditions (p < 0.05). However, epinephrine also significantly increased CO regardless of volemic conditions. CONCLUSION: The present study demonstrates that intravenous administration of norepinephrine and epinephrine increases IVCF, whatever the volemic conditions. The concomitant decreases in PPV and SVV corroborate the fact that catecholamine administration recruits unstressed blood volume. In this regard, interpreting a decrease in PPV and SVV values after catecholamine administration as an obvious indication of restored volemia could be an outright misinterpretation.
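For readers unfamiliar with how these dynamic indices are derived, the sketch below shows the usual definition of PPV over one respiratory cycle (SVV is computed analogously from per-beat stroke volumes). The formula is the standard textbook definition, and the numbers are invented; this is not the monitoring software used in the study.

# Minimal sketch of pulse pressure variation (PPV) from per-beat pressures (mm Hg):
# PP = systolic - diastolic; PPV (%) = 100 * (PPmax - PPmin) / mean(PPmax, PPmin).
import numpy as np

def pulse_pressure_variation(systolic, diastolic):
    pp = np.asarray(systolic, dtype=float) - np.asarray(diastolic, dtype=float)
    pp_max, pp_min = pp.max(), pp.min()
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

# per-beat pressures across one ventilator cycle (illustrative numbers only)
systolic  = [118, 122, 126, 121, 115, 112]
diastolic = [ 72,  73,  74,  73,  71,  70]
print(f"PPV = {pulse_pressure_variation(systolic, diastolic):.1f} %")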
Abstract:
The geometry of a catchment constitutes the basis for distributed, physically based numerical modeling in different geoscientific disciplines. In this paper, results from ground-penetrating radar (GPR) measurements, in the form of a 3D model of total sediment thickness and active layer thickness in a periglacial catchment in western Greenland, are presented. Using the topography, the thickness and distribution of sediments are calculated. Vegetation classification and GPR measurements are used to scale active layer thickness from local measurements to catchment-scale models. Annual maximum active layer thickness varies from 0.3 m in wetlands to 2.0 m in barren areas and areas of exposed bedrock. Maximum sediment thickness is estimated to be 12.3 m in the major valleys of the catchment. A method to correlate surface vegetation with active layer thickness is also presented. By using relatively simple methods, such as probing and vegetation classification, it is possible to upscale local point measurements to catchment-scale models in areas where the upper subsurface is relatively homogeneous. The resulting spatial model of active layer thickness can be used in combination with the sediment model as geometrical input to further studies of subsurface mass transport and hydrological flow paths in the periglacial catchment through numerical modelling.
Abstract:
OntoTag - A Linguistic and Ontological Annotation Model Suitable for the Semantic Web
1. INTRODUCTION. LINGUISTIC TOOLS AND ANNOTATIONS: THEIR LIGHTS AND SHADOWS
Computational Linguistics is already a consolidated research area. It builds upon the results of two other major ones, namely Linguistics and Computer Science and Engineering, and it aims at developing computational models of human language (or natural language, as it is termed in this area). Possibly, its best-known applications are the different tools developed so far for processing human language, such as machine translation systems and speech recognizers or dictation programs.
These tools for processing human language are commonly referred to as linguistic tools. Apart from the examples mentioned above, there are also other types of linguistic tools that are perhaps not so well known, but on which most of the other applications of Computational Linguistics are built. These other types of linguistic tools comprise POS taggers, natural language parsers and semantic taggers, amongst others. All of them can be termed linguistic annotation tools.
Linguistic annotation tools are important assets. In fact, POS and semantic taggers (and, to a lesser extent, also natural language parsers) have become critical resources for the computer applications that process natural language. Hence, any computer application that has to analyse a text automatically and ‘intelligently’ will include at least a module for POS tagging. The more an application needs to ‘understand’ the meaning of the text it processes, the more linguistic tools and/or modules it will incorporate and integrate.
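For readers unfamiliar with the POS-tagging modules mentioned above, the following minimal sketch shows what such a component produces, using the NLTK toolkit as one possible (assumed) choice; the work described here is not tied to this library.

# Minimal POS-tagging sketch with NLTK (one possible toolkit, assumed for illustration).
import nltk

# one-off downloads of tokenizer and tagger models (resource names may vary by NLTK version)
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentence = "Linguistic annotation tools are important assets."
tokens = nltk.word_tokenize(sentence)
print(nltk.pos_tag(tokens))
# e.g. [('Linguistic', 'JJ'), ('annotation', 'NN'), ('tools', 'NNS'), ('are', 'VBP'), ...]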
However, linguistic annotation tools still have some limitations, which can be summarised as follows:
1. Normally, they perform annotations only at a certain linguistic level (that is, Morphology, Syntax, Semantics, etc.).
2. They usually introduce a certain rate of errors and ambiguities when tagging. This error rate ranges from 10 percent up to 50 percent of the units annotated for unrestricted, general texts.
3. Their annotations are most frequently formulated in terms of an annotation schema designed and implemented ad hoc.
A priori, it seems that the interoperation and the integration of several linguistic tools into an appropriate software architecture could most likely solve the limitations stated in (1). Besides, integrating several linguistic annotation tools and making them interoperate could also minimise the limitation stated in (2). Nevertheless, in the latter case, all these tools should produce annotations for a common level, which would have to be combined in order to correct their corresponding errors and inaccuracies. Yet, the limitation stated in (3) prevents both types of integration and interoperation from being easily achieved.
In addition, most high-level annotation tools rely on other, lower-level annotation tools and their outputs to generate their own. For example, sense-tagging tools (operating at the semantic level) often use POS taggers (operating at a lower level, i.e., the morphosyntactic one) to identify the grammatical category of the word or lexical unit they are annotating. Accordingly, if a faulty or inaccurate low-level annotation tool is to be used by another, higher-level one in its process, the errors and inaccuracies of the former should be minimised in advance. Otherwise, these errors and inaccuracies would be transferred to (and even magnified in) the annotations of the high-level annotation tool.
Therefore, it would be quite useful to find a way to
(i) correct or, at least, reduce the errors and the inaccuracies of lower-level linguistic tools;
(ii) unify the annotation schemas of different linguistic annotation tools or, more generally speaking, make these tools (as well as their annotations) interoperate.
Clearly, solving (i) and (ii) should ease the automatic annotation of web pages by means of linguistic tools, and their transformation into Semantic Web pages (Berners-Lee, Hendler and Lassila, 2001). Yet, as stated above, (ii) is a type of interoperability problem. There again, ontologies (Gruber, 1993; Borst, 1997) have been successfully applied thus far to solve several interoperability problems. Hence, ontologies should help solve also the problems and limitations of linguistic annotation tools aforementioned.
Thus, to summarise, the main aim of the present work was to somehow combine these separate approaches, mechanisms and tools for annotation from Linguistics and Ontological Engineering (and the Semantic Web) into a sort of hybrid (linguistic and ontological) annotation model, suitable for both areas. This hybrid (semantic) annotation model should (a) benefit from the advances, models, techniques, mechanisms and tools of these two areas; (b) minimise (and even solve, when possible) some of the problems found in each of them; and (c) be suitable for the Semantic Web. The concrete goals that helped attain this aim are presented in the following section.
2. GOALS OF THE PRESENT WORK
As mentioned above, the main goal of this work was to specify a hybrid (that is, linguistically-motivated and ontology-based) model of annotation suitable for the Semantic Web (i.e. it had to produce a semantic annotation of web page contents). This entailed that the tags included in the annotations of the model had to (1) represent linguistic concepts (or linguistic categories, as they are termed in ISO/DCR (2008)), in order for this model to be linguistically-motivated; (2) be ontological terms (i.e., use an ontological vocabulary), in order for the model to be ontology-based; and (3) be structured (linked) as a collection of ontology-based
Abstract:
Corrosion of reinforcing steel in concrete due to chloride ingress is one of the main causes of the deterioration of reinforced concrete structures. The structures most affected by such corrosion are buildings in marine zones and structures exposed to de-icing salts, like highways and bridges. The process is accompanied by an increase in the volume of the corrosion products at the rebar-concrete interface. Depending on the level of oxidation, iron can expand to as much as six times its original volume. This increase in volume exerts tensile stresses in the surrounding concrete, which result in cracking and spalling of the concrete cover if the concrete tensile strength is exceeded. The mechanism by which steel embedded in concrete corrodes in the presence of chloride is the local breakdown of the passive layer formed in the highly alkaline environment of the concrete. It is assumed that corrosion initiates when a critical chloride content reaches the rebar surface. The mathematical formulation idealizes the corrosion sequence as a two-stage process: an initiation stage, during which chloride ions penetrate to the reinforcing steel surface and depassivate it, and a propagation stage, in which active corrosion takes place until cracking of the concrete cover has occurred. The aim of this research is to develop computer tools to evaluate the duration of the service life of reinforced concrete structures, considering both the initiation and propagation periods. Such tools must offer a friendly interface to facilitate their use by researchers even when their background is not in numerical simulation. For the evaluation of the initiation period, different tools have been developed. Program TavProbabilidade provides means to carry out a probability analysis of a chloride ingress model. Such a tool is necessary due to the lack of data and the general uncertainties associated with the phenomenon of chloride diffusion. It differs from the deterministic approach because it computes not just a chloride profile at a certain age, but a range of chloride profiles for each probability of occurrence. Program TavProbabilidade_Fiabilidade carries out reliability analyses of the initiation period. It takes into account the critical value of the chloride concentration at the steel that causes breakdown of the passive layer and the beginning of the propagation stage. It differs from the deterministic analysis in that it does not predict whether corrosion is going to begin or not, but quantifies the probability of corrosion initiation. Program TavDif_1D was created to perform a one-dimensional deterministic analysis of the chloride diffusion process by the finite element method (FEM), which numerically solves Fick's second law. Despite the different FEM solvers already available in one dimension, the decision to create a new code (TavDif_1D) was taken because of the need for a solver with a friendly interface for pre- and post-processing, according to the needs of IETCC. An innovative tool was also developed, with a systematic method devised to compare the ability of the different 1D models to predict the actual evolution of chloride ingress based on experimental measurements, and also to quantify the degree of agreement of the models with each other. For the evaluation of the entire service life of the structure, a computer program has been developed using the finite element method to couple both service life periods: initiation and propagation.
The 2D program (TavDif_2D) allows the complementary use of two external programs through a single friendly interface:
• GMSH – a finite element mesh generator and post-processing viewer
• OOFEM – a finite element solver
TavDif_2D is responsible for deciding, at each time step, when and where to start applying the boundary conditions of the fracture mechanics module as a function of the chloride concentration and of the corrosion parameters (Icorr, etc.). It is also responsible for verifying the presence and degree of fracture in each element, in order to pass on the information on the variation of the diffusion coefficient with the crack width. The advantages of the FEM with the interface provided by the tool are:
• the flexibility to input data such as material properties and boundary conditions as time-dependent functions;
• the flexibility to predict the chloride concentration profile for different geometries;
• the possibility to couple chloride diffusion (initiation stage) with chemical and mechanical behavior (propagation stage).
The OOFEM code had to be modified to accept temperature, humidity and time-dependent values for the material properties, which is necessary to adequately describe the environmental variations. A 3D simulation has been performed to simulate the behavior of a beam under both the external load and the internal load caused by the corrosion products, using embedded-fracture elements, in order to plot the curve of the deflection of the central region of the beam versus the external load and compare it with the experimental data.
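The initiation-period calculation that these tools perform with FEM can be illustrated, for the simplest case, with the closed-form solution of Fick's second law for a semi-infinite medium with constant surface concentration. The sketch below uses that textbook solution with illustrative parameter values; it is not the thesis' code and it neglects the time- and space-dependent effects the FEM tools are designed to handle.

# Sketch of the initiation period from the closed-form solution of Fick's second law:
# C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t)))); initiation occurs when the chloride
# content at the rebar depth reaches a critical threshold. All values are illustrative.
import numpy as np
from scipy.special import erf

Cs = 0.5        # surface chloride content (% by weight of binder), assumed
Ccrit = 0.05    # assumed critical chloride content at the rebar
D = 1.0e-12     # apparent diffusion coefficient (m^2/s), assumed constant
cover = 0.04    # concrete cover depth (m)

def chloride(x, t):
    # chloride content at depth x (m) and time t (s), semi-infinite medium
    return Cs * (1.0 - erf(x / (2.0 * np.sqrt(D * t))))

years = np.arange(1, 101)
t = years * 365.25 * 24 * 3600.0
c_at_rebar = chloride(cover, t)
t_init = years[np.argmax(c_at_rebar >= Ccrit)] if (c_at_rebar >= Ccrit).any() else None
print("estimated initiation period:", t_init, "years")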
Abstract:
The demand for new services, the emergence of new business models, insufficient innovation, underestimation of customer loyalty and reluctance to adopt new management practices are evidence of the deficiencies in, and the lack of research on, the relations between patients and dental clinics. In this article we propose the structure of a model of Relationship Marketing (RM) for the dental clinic that integrates information from SERVQUAL, Customer Loyalty (CL) and RM activities, and combines the vision of dentist and patient. The first pilot study on dentists showed that they recognize the value of retaining their best patients, yet they do not perform RM actions to retain them; they keep patient databases, but these are not sophisticated enough compared with RM tools; they perceive that patients value "Assurance" and "Empathy" (two dimensions of service quality); and, finally, they indicate that a loyal patient does not necessarily pay more for the service. The proposed model will be validated using Fuzzy Logic simulation, and the ultimate goal of this research line is to contribute a new definition of CL.
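Since the model is to be validated with a Fuzzy Logic simulation, the toy sketch below shows one way such a simulation could look: triangular membership functions over hypothetical SERVQUAL-style scores, combined by a single rule into a fuzzy loyalty estimate. The membership shapes, rule, and values are invented for illustration and do not come from the article.

# Toy fuzzy-logic sketch (hypothetical memberships and rule; not the article's model):
# map SERVQUAL-style Assurance/Empathy scores to a fuzzy Customer Loyalty estimate.
import numpy as np

def tri(x, a, b, c):
    # triangular membership function with feet at a, c and peak at b
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def loyalty_estimate(assurance, empathy):
    y = np.linspace(0.0, 10.0, 101)                    # output universe: loyalty score
    high_assurance = tri(assurance, 5.0, 10.0, 15.0)   # degree to which assurance is 'high'
    high_empathy = tri(empathy, 5.0, 10.0, 15.0)
    # rule: IF assurance is high AND empathy is high THEN loyalty is high
    fire_high = min(high_assurance, high_empathy)
    fire_low = 1.0 - fire_high                         # complementary rule for 'low'
    agg = np.maximum(np.minimum(fire_high, tri(y, 5.0, 10.0, 15.0)),
                     np.minimum(fire_low, tri(y, -5.0, 0.0, 5.0)))
    return float(np.sum(y * agg) / np.sum(agg))        # centroid defuzzification

print(f"estimated loyalty: {loyalty_estimate(assurance=8.0, empathy=6.5):.1f} / 10")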
Abstract:
Today P2P faces two important challenges: the design of mechanisms to encourage users' collaboration in multimedia live streaming services, and the design of reliable algorithms with QoS provision to encourage multimedia providers to employ the P2P topology in commercial live streaming systems. We believe that these two challenges are tightly related and that there is much to be done in this respect. This paper analyzes the effect of user behavior in a multi-tree P2P overlay and describes a business model based on monetary discounts as incentives in a P2P-Cloud multimedia streaming system. We believe a discount model can boost users' cooperation and loyalty and enhance the overall system integrity and performance. Moreover, the model bounds the provider's revenue and cost when the P2P system is leveraged on a cloud infrastructure. Our case study shows that a streaming system provider can establish or adapt his business model by applying the described bounds to achieve a good discount-revenue trade-off and promote the system to the users.