915 results for The Real Failure Rate of Restaurants


Relevance:

100.00%

Abstract:

Interpolated data are an important part of environmental information exchange, as many variables can only be measured at discrete sampling locations. Spatial interpolation is a complex operation that has traditionally required expert treatment, making automation a serious challenge. This paper presents a few lessons learnt from INTAMAP, a project that is developing an interoperable web processing service (WPS) for the automatic interpolation of environmental data using advanced geostatistics, adopting a Service Oriented Architecture (SOA). The "rainbow box" approach we followed provides access to the functionality at a whole range of different levels. We show here how the integration of open standards, open source and powerful statistical processing capabilities allows us to automate a complex process while offering users a level of access and control that best suits their requirements. This facilitates benchmarking exercises as well as the regular reporting of environmental information without requiring remote users to have specialized skills in geostatistics.
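As context for how clients consume such a service, below is a minimal sketch of a WPS 1.0.0 key-value-pair Execute request over HTTP; the endpoint URL, process identifier and input names are hypothetical placeholders, not the actual INTAMAP service details, and the exact DataInputs encoding would depend on the process description.

```python
# Minimal sketch of invoking an OGC WPS 1.0.0 process via a KVP Execute
# request. URL, process identifier and input names are hypothetical.
import requests

WPS_URL = "http://example.org/wps"  # hypothetical endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "interpolate",  # hypothetical process name
    # observation and prediction-location documents (placeholder inputs)
    "datainputs": "observations=http://example.org/obs.xml;"
                  "predictionLocations=http://example.org/grid.xml",
}

response = requests.get(WPS_URL, params=params, timeout=60)
response.raise_for_status()
print(response.text)  # the WPS ExecuteResponse XML
```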

Relevance:

100.00%

Abstract:

The increase in the number of financial restatements in recent years has resulted in a significant decrease in the market capitalization of restated companies. Prior literature did not differentiate between single and multiple restatement announcements. This research investigated the inter-relationships among multiple financial restatements, corporate governance, market microstructure and the firm's rate of return in the form of three essays, differentiating between single and multiple restatement announcement companies. The first essay examined the stock performance of companies announcing financial restatements multiple times. The postulation is that prior research overestimates the abnormal return by not separating single restatement companies from multiple restatement companies. This study investigated how the market penalizes companies that announce restatements more than once. Differentiating the restatement announcement data based on the number of restatement announcements, the results supported the non-persistence hypothesis that the market has no memory and that the negative abnormal returns obtained after each restatement announcement are completely random. The second essay examined multiple restatement announcements and the perceived resultant information asymmetry around the announcement day. This study examined the pattern of information asymmetry for these announcements in terms of whether the bid-ask spread widens around the announcement day. The empirical analysis supported the hypothesis that the spread widens not only around the first restatement announcement day but around every subsequent announcement day as well. The third essay empirically examined the financial and corporate governance characteristics of single and multiple restatement announcement companies. The analysis showed that corporate governance variables influence the occurrence of multiple restatement announcements and can distinguish multiple restatement announcement companies from single restatement announcement companies.
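For illustration only (a generic sketch, not the dissertation's exact procedure), abnormal returns around an announcement are typically measured with a market-model event study along these lines; the window lengths and variable names are assumptions, and the two series are assumed to share the same daily return index.

```python
# Generic market-model event study: estimate (alpha, beta) on a pre-event
# window, then sum abnormal returns over the event window.
import numpy as np
import pandas as pd

def cumulative_abnormal_return(stock: pd.Series, market: pd.Series,
                               event_idx: int, est_win: int = 120,
                               event_win: int = 3) -> float:
    """CAR over [event_idx - event_win, event_idx + event_win]; the market
    model is fit on the est_win days ending just before the event window.
    Assumes enough pre-event history exists."""
    est = slice(event_idx - event_win - est_win, event_idx - event_win)
    beta, alpha = np.polyfit(market.iloc[est].values,
                             stock.iloc[est].values, 1)
    ev = slice(event_idx - event_win, event_idx + event_win + 1)
    expected = alpha + beta * market.iloc[ev]
    return float((stock.iloc[ev] - expected).sum())
```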

Relevance:

100.00%

Abstract:

The real-quaternionic indicator, also called the $\delta$ indicator, indicates whether a self-conjugate representation is of real or quaternionic type. It is closely related to the Frobenius-Schur indicator, which we call the $\varepsilon$ indicator. The Frobenius-Schur indicator $\varepsilon(\pi)$ is known to be given by a particular value of the central character. We would like a similar result for the $\delta$ indicator. When $G$ is compact, $\delta(\pi)$ and $\varepsilon(\pi)$ coincide; in general, they are not necessarily the same. In this thesis, we give a relation between the two indicators when $G$ is a real reductive algebraic group. This relation also leads to a formula for $\delta(\pi)$ in terms of the central character. In the second part, we consider the construction of the local Langlands correspondence for $GL(2,F)$ when $F$ is a non-Archimedean local field of odd residual characteristic. By re-examining the construction, we provide new proofs of some important properties of the correspondence; namely, the construction is independent of the choice of additive character in the theta correspondence.
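For context, the classical Frobenius-Schur indicator mentioned above is given, for a finite group $G$ with character $\chi_\pi$ (the sum is replaced by a normalized Haar integral in the compact case), by

$$\varepsilon(\pi) \;=\; \frac{1}{|G|}\sum_{g\in G}\chi_\pi(g^2)\;\in\;\{1,0,-1\},$$

where $\varepsilon(\pi)=1$ for representations of real type, $\varepsilon(\pi)=-1$ for quaternionic type, and $\varepsilon(\pi)=0$ when $\pi$ is not self-dual.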

Relevance:

100.00%

Abstract:

The objective of these notes is to present a simple mathematical model of the determination of the current-account real exchange rate as defined by Bresser-Pereira (2010), i.e. the real exchange rate that guarantees the intertemporal equilibrium of the balance of payments, and to show the relation between the real exchange rate and productive specialization at the theoretical and empirical levels.

Relevance:

100.00%

Abstract:

Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To make use of these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model, in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to strengthen the model, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to the observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected according to successively increasing elevation. For fixing the scalar ambiguity, a rounding method with controllable error probability is proposed. The established ionosphere-constrained model can be solved efficiently with a sequential Kalman filter. It can be either reduced to special cases simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the USA CORS network. The results show that the new widelane AR scheme obtains a 99.4% fixing success rate with a 0.6% failure rate, while the new rounding method for narrowlane AR obtains a fixing rate of 89% with a failure rate of 0.8%. In summary, AR reliability can be efficiently improved with rigorously controllable probability of incorrectly fixed ambiguities.
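As background for the success and failure rates quoted above, a quantity used throughout this literature is Teunissen's integer-bootstrapping success rate, computed from the conditional standard deviations of the decorrelated float ambiguities. A minimal sketch, with made-up input values:

```python
# Integer-bootstrapping success rate:
#   P_s = prod_i (2 * Phi(1 / (2 * sigma_i)) - 1),
# where sigma_i are the conditional standard deviations (in cycles) of the
# decorrelated float ambiguities and Phi is the standard normal CDF.
import math

def bootstrapping_success_rate(cond_sigmas):
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    p = 1.0
    for sigma in cond_sigmas:
        p *= 2.0 * phi(1.0 / (2.0 * sigma)) - 1.0
    return p

print(bootstrapping_success_rate([0.05, 0.08, 0.12]))  # illustrative sigmas
```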

Relevance:

100.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, how to determine their thresholds is still not well understood. Currently, the threshold is determined either empirically or with the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach by a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling and approximation errors are analysed with simulated data to avoid nuisance biases and unrealistic stochastic model effects. The results indicate the proposed method can greatly simplify the FF-approach without introducing significant modeling error. The threshold function method makes fixed failure rate threshold determination feasible for real-time applications.
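A sketch of the threshold-function idea under stated assumptions: a fitted rational function maps the easy-to-compute bootstrapping success rate to a difference-test threshold. Only the functional form (a rational model) follows the text; the first-order form and the coefficients below are hypothetical placeholders, not the paper's fitted model.

```python
# Hypothetical threshold function: a first-order rational model mapping the
# bootstrapping success rate p_s to an FF-difference-test threshold mu.
# Coefficients a, b, c are illustrative placeholders, NOT fitted values.
def threshold_function(p_s: float, a: float = 3.0, b: float = -2.95,
                       c: float = -0.5) -> float:
    """mu = (a + b * p_s) / (1 + c * p_s), for 0 < p_s < 1."""
    return (a + b * p_s) / (1.0 + c * p_s)

# The placeholder is shaped so that a stronger model (higher success rate)
# gets a smaller threshold while the failure-rate tolerance stays fixed.
print(threshold_function(0.95))
```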

Relevance:

100.00%

Abstract:

Ambiguity validation, an important step in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in the positioning computation. Most existing investigations of ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous basis in probability theory but employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function, is proposed for the FF-difference test. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
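For concreteness, a minimal sketch of the difference test itself in its usual formulation: the best integer candidate is accepted when the gap between the squared norms of the second-best and best candidates, in the metric of the ambiguity variance-covariance matrix, exceeds the threshold. Variable names are assumptions.

```python
# Difference-test sketch: accept the best candidate a1 when
#   ||a_hat - a2||^2_Q - ||a_hat - a1||^2_Q >= mu,
# where a_hat is the float ambiguity vector, a2 the second-best integer
# candidate, and Q the ambiguity variance-covariance matrix. `mu` could come
# from a threshold function such as the one sketched above.
import numpy as np

def difference_test(a_hat, a1, a2, Q, mu):
    Q_inv = np.linalg.inv(Q)
    r1 = a_hat - a1
    r2 = a_hat - a2
    d = r2 @ Q_inv @ r2 - r1 @ Q_inv @ r1
    return d >= mu
```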

Relevance:

100.00%

Abstract:

The original solution to the high failure rate of software development projects was the imposition of an engineering approach to software development, with processes aimed at providing a repeatable structure to maintain consistency in the 'production process'. Despite these attempts at addressing the crisis in software development, others have argued that the rigid processes of an engineering approach did not provide the solution. The Agile approach to software development strives to change how software is developed. It does this primarily by relying on empowered teams of developers who are trusted to manage the necessary tasks, and who accept that change is a necessary part of a development project. The use of, and interest in, Agile methods in software development projects has expanded greatly, yet this expansion has been predominantly practitioner-driven. There is a paucity of scientific research on Agile methods and how they are adopted and managed. This study addresses that paucity by examining the adoption of Agile through a theoretical lens, namely double loop learning theory. The behaviours required in an Agile team are the same behaviours required in double loop learning; therefore, a transition to double loop learning is required for a successful Agile adoption. The theory of triple loop learning highlights that power factors (or power mechanisms, in this research) can inhibit the attainment of double loop learning. This study identifies the negative behaviours, or potential power mechanisms, that can inhibit the double loop learning inherent in an Agile adoption, and determines how the Agile processes and behaviours can create these power mechanisms and how these power mechanisms impact on double loop learning and the Agile adoption. This is a critical realist study, which acknowledges that the real world is a complex one, hierarchically structured into layers. An a priori framework is created to represent these layers, categorised as: the Agile context, the power mechanisms, and double loop learning. The aim of the framework is to explain how the Agile processes and behaviours, through the teams of developers and project managers, can ultimately impact the double loop learning behaviours required in an Agile adoption. Four case studies provide further refinement of the framework, with changes required due to observations that were often different from what the existing literature would have predicted. The study concludes by explaining how teams of developers, individual developers, and project managers, working with the Agile processes and required behaviours, can inhibit the double loop learning required in an Agile adoption. A solution is then proposed to mitigate these negative impacts. Additionally, two new research processes are introduced to add to the Information Systems research toolkit.

Relevance:

100.00%

Abstract:

In a general equilibrium model, we show that the value of the equilibrium real exchange rate is affected by its own volatility. Risk-averse exporters, who make their exporting decision before observing the realization of the real exchange rate, choose to export less the more volatile the real exchange rate is. Therefore the trade balance and the variance of the real exchange rate are negatively related. An increase in the volatility of the real exchange rate, for instance, deteriorates the trade balance, and to restore equilibrium a real exchange rate depreciation has to take place. In the empirical part of the paper we use the traditional (unconditional) standard deviation of RER changes as our measure of RER volatility. We describe the behavior of RER volatility for Brazil, Argentina and Mexico, using monthly data for the three countries and also daily data for Brazil. Interesting patterns of volatility can be associated with the nature of the several stabilization plans adopted in those countries and with changes in the exchange rate regimes.
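A minimal sketch of the volatility measure described above, the (unconditional) standard deviation of real-exchange-rate changes, here computed on a rolling window; the series name and window length are assumptions.

```python
# Rolling standard deviation of log RER changes as a simple volatility
# measure; a 12-observation window is an assumption, not the paper's choice.
import numpy as np
import pandas as pd

def rer_volatility(rer: pd.Series, window: int = 12) -> pd.Series:
    """Rolling std of log real-exchange-rate changes."""
    return np.log(rer).diff().rolling(window).std()
```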

Relevance:

100.00%

Abstract:

OBJECTIVES The aim of this study was to describe the process of obtaining Food and Drug Administration (FDA) approval for the expanded indication for treatment with the Resolute zotarolimus-eluting stent (R-ZES) (Medtronic, Inc., Santa Rosa, California) in patients with coronary artery disease and diabetes. BACKGROUND The R-ZES is the first drug-eluting stent specifically indicated in the United States for percutaneous coronary intervention in patients with diabetes. METHODS We pooled patient-level data for 5,130 patients from the RESOLUTE Global Clinical Program. A performance goal prospectively determined in conjunction with the FDA was established as a 12-month target vessel failure rate of 14.5%. In addition to the FDA pre-specified cohort of less complex patients with diabetes (n = 878), we evaluated outcomes of the R-ZES in all 1,535 patients with diabetes compared with all 3,595 patients without diabetes at 2 years. RESULTS The 12-month rate of target vessel failure in the pre-specified diabetic cohort was 7.8% (upper 95% confidence interval: 9.51%), significantly lower than the performance goal of 14.5% (p < 0.001). After 2 years, the cumulative incidence of target lesion failure in patients with noninsulin-treated diabetes was comparable to that of patients without diabetes (8.0% vs. 7.1%). The higher-risk insulin-treated population demonstrated a significantly higher target lesion failure rate (13.7%). In the whole population, including complex patients, rates of stent thrombosis were not significantly different between patients with and without diabetes (1.2% vs. 0.8%). CONCLUSIONS The R-ZES is safe and effective in patients with diabetes. Long-term clinical outcomes in patients with noninsulin-treated diabetes are equivalent to those in patients without diabetes. Patients with insulin-treated diabetes remain a higher-risk subset. (The Medtronic RESOLUTE Clinical Trial; NCT00248079; Randomized, Two-arm, Non-inferiority Study Comparing Endeavor-Resolute Stent With Abbot Xience-V Stent [RESOLUTE-AC]; NCT00617084; The Medtronic RESOLUTE US Clinical Trial (R-US); NCT00726453; RESOLUTE International Registry: Evaluation of the Resolute Zotarolimus-Eluting Stent System in a 'Real-World' Patient Population [R-Int]; NCT00752128; RESOLUTE Japan-The Clinical Evaluation of the MDT-4107 Drug-Eluting Coronary Stent [RJ]; NCT00927940).
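For illustration only, the pass/fail logic of such a performance-goal comparison can be sketched as follows; the event counts are made up, and the trial's actual confidence-interval method may differ from this normal approximation.

```python
# Performance-goal check for a single event rate: the goal is met when the
# upper one-sided 95% confidence bound on the rate falls below the goal.
import math

def meets_performance_goal(events: int, n: int, goal: float,
                           z: float = 1.645) -> bool:
    rate = events / n
    upper = rate + z * math.sqrt(rate * (1.0 - rate) / n)  # normal approx.
    return upper < goal

print(meets_performance_goal(events=68, n=878, goal=0.145))  # made-up counts
```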

Relevance:

100.00%

Abstract:

Background: The preservation of meniscal tissue is important to protect the joint surfaces. Purpose: We take an aggressive approach to meniscal repair, including repairing tears other than those classically suited to repair. Here we present the medium- to long-term outcome of meniscal repair (inside-out) in elite athletes. Study Design: Case series; Level of evidence, 4. Methods: Forty-two elite athletes underwent 45 meniscal repairs. All repairs were performed using an arthroscopically assisted inside-out technique. Eighty-three percent of these athletes had ACL reconstruction at the same time. Patients returned a completed questionnaire (including Lysholm and International Knee Documentation Committee [IKDC] scores). Mean follow-up was 8.5 years. Failure was defined as the development of joint line pain and/or locking or swelling requiring repeat arthroscopy and partial meniscectomy. Results: The average Lysholm and subjective IKDC scores were 89.6 and 85.4, respectively. Eighty-one percent of patients returned to their main sport, most to a similar level, at a mean of 10.4 months after repair, reflecting the high rate of ACL reconstruction in this group. We identified 11 definite failures (10 medial and 1 lateral meniscus) that required excision, representing a 24% failure rate. One further patient had a possible failed repair, giving a worst-case failure rate of 26.7% at a mean of 42 months after surgery. However, 7 of these failures were associated with a further injury, so the atraumatic failure rate was 11%. Age and the size and location of the tears were not associated with a higher failure rate. Medial meniscal repairs were significantly more likely to fail than lateral meniscal repairs, with failure rates of 36.4% and 5.6%, respectively (P < .05). Conclusion: Meniscal repair and healing are possible, and most elite athletes can return to their preinjury level of activity.

Relevance:

100.00%

Abstract:

With the advent of live cell imaging microscopy, new types of mathematical analyses and measurements are possible. Many of the real-time movies of cellular processes are visually very compelling, but elementary analysis of changes over time in quantities such as surface area and volume often shows that there is more to the data than meets the eye. This unit outlines a geometric modeling methodology and applies it to the tubulation of vesicles during endocytosis. Using these principles, it has been possible to build better qualitative and quantitative understandings of the systems observed, as well as to make predictions about quantities such as ligand or solute concentration, vesicle pH, and the amount of membrane trafficked. The purpose is to outline a methodology for analyzing real-time movies that has led to a greater appreciation of the changes occurring during the time frame of real-time video microscopy, and of how additional quantitative measurements allow further hypotheses to be generated and tested.
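As a toy illustration of the kind of geometric bookkeeping involved, the closed-form surface area and volume of an idealized spherical vesicle and of a tube (a cylinder with hemispherical caps, the shape relevant to tubulation) can be compared; the shapes are idealizations, not the unit's actual models.

```python
# Closed-form (area, volume) for two idealized vesicle shapes, used to track
# how area/volume ratios change as a sphere tubulates.
import math

def sphere(r: float):
    """(surface area, volume) of a sphere of radius r."""
    return 4.0 * math.pi * r**2, (4.0 / 3.0) * math.pi * r**3

def tube(r: float, length: float):
    """(surface area, volume) of a cylinder of radius r and the given
    length, capped by two hemispheres."""
    area = 2.0 * math.pi * r * length + 4.0 * math.pi * r**2
    volume = math.pi * r**2 * length + (4.0 / 3.0) * math.pi * r**3
    return area, volume

# A tube holding the same volume as a sphere exposes more membrane area,
# which is why these ratios change measurably during tubulation.
```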

Relevance:

100.00%

Abstract:

The material response and failure mechanism of a unidirectional metal matrix composite under impulsive shear loading are investigated in this paper. Both experimental and analytical studies were performed. The shear strengths of a unidirectional Cf/A356.0 composite and of A356.0 aluminum alloy at high strain rates were measured with a modified split Hopkinson torsional bar technique. The results indicated that the carbon fibers did not improve the shear strength of the aluminum matrix when the fiber orientation was aligned with the shear loading axis. Microscopic inspection of the fractured surface showed a multi-scale zigzag feature, implying a complicated shear failure mechanism in the composite. In addition to the testing, the micromechanical stress field in the composite was analyzed by the generalized Eshelby equivalent method (GEEM). The influence of matrix cracking on the micromechanical stress field was investigated as well. The results showed that the stress distribution in the composite is quite inhomogeneous, with very high shear stress concentrations in some regions of the matrix. The high shear stress concentration in the matrix induces tensile cracking at 45 degrees to the shear direction. This in turn aggravates the stress concentration at the fiber/matrix interface and finally leads to catastrophic failure of the composite. From the correlation between the analysis and the experimental results, the shear failure mechanism of the unidirectional Cf/A356.0 composite can be elucidated qualitatively.