804 results for Distance-balanced graph
Abstract:
Traditional approaches to evaluating performance in hotels have mainly used financial measures. Building on Speckbacher et al. (2003), this Work Project aims to design and propose a Balanced Scorecard Type II as a performance measurement/management system for the hospitality industry, based on data collected at the Luxury Brand Hotels of the Pestana Group. The main contribution is to better align the vision, strategy, and financial and non-financial performance measures in this category of hotels, in particular those of the Pestana Group, and by doing so lead their managers to focus on what is truly critical and, consequently, improve overall performance.
Abstract:
When assessing investment options, investors often focus on the graphs in annual reports, despite the lack of auditing of such graphs. If poorly constructed, graphs distort perceptions and lead to inaccurate decisions. This study examines graph usage in all companies listed on Euronext Lisbon in 2013. The findings suggest that graphs are common in the annual reports of Portuguese companies and that, while there is no evidence of Selectivity Distortion, both Measurement and Orientation Distortions are pervasive. The study recommends the auditing of financial graphs and urges preparers and users of annual reports to be wary of the possibility of graph distortion.
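Measurement distortion of the kind reported above is commonly quantified in this literature by the Graph Discrepancy Index (GDI); the abstract does not state which measure the study actually used, so the formula below is given only as an illustrative assumption.

```latex
% Graph Discrepancy Index (assumed operationalisation, not stated in the abstract):
% a = percentage change depicted by the graph's physical dimensions,
% b = percentage change in the underlying data; GDI = 0 indicates a faithful graph.
\[
\mathrm{GDI} = \left( \frac{a}{b} - 1 \right) \times 100
\]
```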
Abstract:
As investors and other users of annual reports often focus their attention on graphs, it is important that graphs portray accurate and reliable information. However, previous studies show that graphs often distort information and mislead users. This study analyses graph usage in the annual reports of the 52 most traded Norwegian companies. The findings suggest that Norwegian companies commonly use graphs and that two graph distortions, presentational enhancement and measurement distortion, are present. No evidence of selectivity was found. This study recommends the development of guidelines for graphical disclosure and advises preparers and users of annual reports to be aware of misleading graphs.
Abstract:
BACKGROUND AND PURPOSE: Accurate placement of an external ventricular drain (EVD) for the treatment of hydrocephalus is of paramount importance for its functionality and in order to minimize morbidity and complications. The aim of this study was to compare two different drain insertion assistance tools with the traditional free-hand anatomical landmark method and to measure efficacy, safety and precision. METHODS: Ten cadaver heads were prepared by opening large bone windows centered on Kocher's points on both sides. Nineteen physicians, divided into two groups (trainees and board-certified neurosurgeons), performed EVD insertions. The target for the ventricular drain tip was the ipsilateral foramen of Monro. Each participant inserted the external ventricular catheter in three different ways: 1) free-hand by anatomical landmarks, 2) neuronavigation-assisted (NN), and 3) XperCT-guided (XCT). The number of ventricular hits and dangerous trajectories, the procedure time, the radiation exposure of patients and physicians, the distance of the catheter tip to the target, and the size of deviations projected in the orthogonal planes were measured and compared. RESULTS: Insertion using XCT increased the probability of ventricular puncture from 69.2 to 90.2 % (p = 0.02). Non-assisted placements were significantly less precise (catheter tip to target distance 14.3 ± 7.4 mm versus 9.6 ± 7.2 mm, p = 0.0003). The insertion time increased from 3.04 ± 2.06 min to 7.3 ± 3.6 min (p < 0.001). The X-ray exposure for XCT was 32.23 mSv, but could be reduced to 13.9 mSv if patients were initially imaged in the hybrid operating suite. No supplementary radiation exposure is needed for NN if patients are initially imaged according to a navigation protocol. CONCLUSION: This ex vivo study demonstrates significantly improved accuracy and safety using either the NN- or the XCT-assisted method. Efforts should therefore be undertaken to implement these new technologies into daily clinical practice. However, the accuracy of an EVD placement has to be balanced against its urgency, as the image-guided insertion techniques entail a longer preparation time due to specific image acquisition and trajectory planning.
Abstract:
The molecular characterization of balanced chromosomal rearrangements has always been advantageous in identifying disease-causing genes. Here, we describe the breakpoint mapping of a de novo balanced translocation t(7;12)(q11.22;q14.2) in a patient presenting with failure to thrive associated with moderate mental retardation, facial anomalies, and chronic constipation. The localization of the breakpoints and the co-occurrence of Williams-Beuren syndrome and 12q14 microdeletion syndrome phenotypes suggested that the expression of some of the dosage-sensitive genes of these two segmental aneuploidies was modified in cells of the proposita. However, we were unable to identify chromosome 7- and/or 12-mapping genes that showed disturbed expression in the lymphoblastoid cells of the proposita. This case shows that position effects might operate in some tissues but not in others. It also illustrates the overlap of phenotypes presented by patients with the recently described 12q14 structural rearrangements.
Abstract:
Predicting progeny performance from parental genetic divergence can potentially enhance the efficiency of supportive breeding programmes and facilitate risk assessment. Yet experimental testing of the effects of breeding distance on offspring performance remains rare, especially in wild populations of vertebrates. Recent studies have demonstrated that embryos of salmonid fish are sensitive indicators of additive genetic variance for viability traits. We therefore used gametes of wild brown trout (Salmo trutta) from five genetically distinct populations of a river catchment in Switzerland, and used a full factorial design to produce over 2,000 embryos in 100 different crosses with varying genetic distances (FST range 0.005-0.035). Customized egg capsules allowed the survival of individual embryos to be recorded until hatching under natural field conditions. Our breeding design enabled us to evaluate the roles of the environment, of genetic and non-genetic parental contributions, and of interactions between these factors in embryo viability. We found that embryo survival was strongly affected by maternal environmental (i.e. non-genetic) effects and by the microenvironment, i.e. by the location within the gravel. However, embryo survival was not predicted by population divergence, parental allelic dissimilarity, or heterozygosity, either in the field or under laboratory conditions. Our findings suggest that the genetic effects of inter-population hybridization within a genetically differentiated meta-population can be minor in comparison to environmental effects.
Abstract:
We examine entry mode choice and its consequences when a multinational enterprise (MNE) expands into an institutionally different country. We argue that discussions of entry mode should distinguish between informal (e.g., culture) and formal (e.g., laws) institutions, and should take into account not just the home country of the MNE and its distance to the focal host country, but the MNE's overall footprint and experience across the world in general, especially in countries with an institutional structure that is similar to that of the focal host country. Specifically, we argue that firms with experience in countries with different informal institutions will be more likely to enter via acquisitions than firms without such experience, that such experience will not matter as much in the case of formal institutions, and that such firms will exit more quickly when they enter via equity alliances than through full acquisitions. We also distinguish between balanced and unbalanced alliances and argue that balanced alliances will be more enduring, but only when the host country is culturally (not legally) different from the other countries where the MNE has experience. Our arguments suggest that entry mode should be conditioned on a firm's experience in other markets, and that intercountry differences in formal versus informal institutions have distinct influences on entry mode.
Abstract:
Compare also Cat. imp., liv. 106, f. 21 (Xin fa suan shu, 100 volumes). Published by imperial order before 1644, by Xu Guangqi and Father Terenz (1576-1630).
Abstract:
The main objective of this research was to examine the relationship between surface electromyographic (SEMG) spike activity and force. The secondary objective was to determine to what extent subcutaneous tissue impacts the high frequency component of the signal, as well as to examine the relationship between measures of SEMG spike shape and their traditional time and frequency analogues. A total of 96 participants (46 males and 50 females), ranging in age from 18 to 35 years, generated three 5-second isometric step contractions at each force level of 40, 60, 80, and 100 percent of maximal voluntary contraction (MVC). The presentation of the contractions was balanced across subjects. The right arm of the subject was positioned in the sagittal plane, with the shoulder and elbow flexed to 90 degrees. The elbow rested on a support in a neutral position (mid pronation/mid supination) and was placed within a wrist cuff, fastened below the styloid process. The wrist cuff was attached to a load cell (JR3 Inc., Woodland, CA) recording the force produced. Biceps brachii activity was monitored with a pair of Ag/AgCl recording electrodes (Grass F-E9, Astro-Med Inc., West Warwick, RI) placed in a bipolar configuration, with an interelectrode distance (IED) of 2 cm, distal to the motor point. Data analysis was performed on a 1-second window of data in the middle of the 5-second contraction. The results indicated that all spike shape measures exhibited significant (p < 0.01) differences as force increased from 40 to 100% MVC. The spike shape measures suggest that increased motor unit (MU) recruitment was responsible for increasing force up to 80% MVC. The results suggested that further increases in force relied on MU synchronization. The results also revealed that subcutaneous tissue (skin fold thickness) had no relationship (r = 0.02; p > 0.05) with the mean number of peaks per spike (MNPPS), which was the high frequency component of the signal. Mean spike amplitude (MSA) and mean spike frequency (MSF) were highly correlated with their traditional measures, root mean square (RMS) and mean power frequency (MPF), respectively (r = 0.99; r = 0.97; p < 0.01).
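The spike-shape measures above were validated against the traditional amplitude and frequency measures, RMS and mean power frequency. The short sketch below is not taken from the thesis; the sampling rate and the synthetic window are illustrative assumptions, and it only shows how those two traditional measures are typically computed from a one-second SEMG analysis window.

```python
# Minimal sketch: traditional SEMG measures (RMS amplitude and mean power
# frequency) against which the spike-shape measures MSA and MSF were correlated.
import numpy as np

def rms(emg_window: np.ndarray) -> float:
    """Root mean square of the SEMG window (amplitude measure)."""
    return float(np.sqrt(np.mean(emg_window ** 2)))

def mean_power_frequency(emg_window: np.ndarray, fs: float) -> float:
    """Mean power frequency: power-spectrum-weighted average frequency."""
    spectrum = np.abs(np.fft.rfft(emg_window - emg_window.mean())) ** 2
    freqs = np.fft.rfftfreq(emg_window.size, d=1.0 / fs)
    return float(np.sum(freqs * spectrum) / np.sum(spectrum))

# Example: a 1-second analysis window at an assumed sampling rate of 1024 Hz;
# the random signal stands in for a recorded biceps brachii SEMG trace.
fs = 1024.0
window = np.random.default_rng(0).normal(size=int(fs))
print(rms(window), mean_power_frequency(window, fs))
```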
Abstract:
This study had three purposes related to the effective implementation and practice of computer-mediated online distance education (C-MODE) at the elementary level: (a) to identify a preliminary framework of criteria or guidelines for effective implementation and practice, (b) to identify areas of C-MODE for which criteria or guidelines of effectiveness have not yet been developed, and (c) to develop an implementation and practice criteria questionnaire based on a review of the distance education literature, and to use the questionnaire in an exploratory survey of elementary C-MODE practitioners. Using the survey instrument, the beliefs and attitudes of 16 elementary C-MODE practitioners about what constitutes effective implementation and practice principles were investigated. Respondents, who included both administrators and instructors, provided information about themselves and the program in which they worked. They rated 101 individual criteria statements on a 5-point Likert scale that included the values 1 (Strongly Disagree), 2 (Disagree), 3 (Neutral or Undecided), 4 (Agree), and 5 (Strongly Agree). Respondents also provided qualitative data by commenting on the individual statements or suggesting other statements they considered important. Eighty-two different statements or guidelines related to the successful implementation and practice of computer-mediated online education at the elementary level were endorsed. Responses to a small number of statements differed significantly by gender and years of experience. A new area for investigation, namely the role of parents, which has received little attention in the online distance education literature, emerged from the findings. The study also identified a number of other areas within an elementary context where additional research is necessary. These included: (a) differences in the factors that determine learning in a distance education setting and in traditional settings, (b) elementary students' ability to function in an online setting, (c) the role and workload of instructors, (d) the importance of effective, timely communication with students and parents, and (e) the use of a variety of media.
Abstract:
The hyper-star interconnection network was proposed in 2002 to overcome the drawbacks of the hypercube and its variations concerning the network cost, which is defined as the product of the degree and the diameter. Some properties of the graph, such as connectivity, symmetry properties and embedding properties, have been studied by other researchers, and routing and broadcasting algorithms have also been designed. This thesis studies the hyper-star graph from both the topological and the algorithmic point of view. For the topological properties, we try to establish relationships between hyper-star graphs and other known graphs. We also give a formal equation for the surface area of the graph. Another topological property we are interested in is the Hamiltonicity of this graph. For the algorithms, we design an all-port broadcasting algorithm and a single-port neighbourhood broadcasting algorithm for the regular form of the hyper-star graphs. Both algorithms are time-optimal. Furthermore, we prove that the folded hyper-star, a variation of the hyper-star, is maximally fault-tolerant.
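For readers unfamiliar with the topology, the sketch below builds a small hyper-star graph under the definition commonly used in the literature (vertices are length-n binary strings with k ones; two strings are adjacent when exchanging the first symbol with a differing symbol in another position maps one to the other) and evaluates the network cost mentioned above. The thesis's exact conventions may differ, so treat this only as an illustration.

```python
# Illustrative sketch: construct HS(n, k) and compute network cost = degree x diameter.
from itertools import combinations
import networkx as nx

def hyper_star(n: int, k: int) -> nx.Graph:
    vertices = ["".join("1" if i in ones else "0" for i in range(n))
                for ones in combinations(range(n), k)]
    g = nx.Graph()
    g.add_nodes_from(vertices)
    for v in vertices:
        for i in range(1, n):
            if v[i] != v[0]:  # swapping position 0 with position i yields a neighbour
                g.add_edge(v, v[i] + v[1:i] + v[0] + v[i + 1:])
    return g

g = hyper_star(6, 3)              # regular form HS(2k, k) with k = 3
degree = max(d for _, d in g.degree())
cost = degree * nx.diameter(g)    # network cost = degree x diameter
print(g.number_of_nodes(), degree, nx.diameter(g), cost)
```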
Abstract:
Roots and root finding are concepts familiar to most branches of mathematics. In graph theory, H is a square root of G, and G is the square of H, if two vertices x, y are adjacent in G if and only if they are at distance at most two in H. The graph square is a basic operation, with a number of results about its properties in the literature. We study the characterization and recognition problems for graph powers. There are algorithmic and computational approaches to the decision problem of whether a given graph is a certain power of some graph. There are polynomial-time algorithms to solve this problem for squares of graphs with girth at least six, while NP-completeness has been proven for squares of graphs with girth at most four. The girth-parameterized root-finding problem has remained open for squares of graphs with girth five. We settle the conjecture that recognition of squares of graphs with girth 5 is NP-complete. This result provides the complete dichotomy theorem for the square root finding problem.
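A minimal sketch of the square operation defined above, using networkx for brevity. The pairing of square and is_square_root is only illustrative; deciding whether an arbitrary graph has any square root at all is the hard recognition problem the thesis addresses.

```python
# Illustrative sketch of the graph square and the square-root relation.
import networkx as nx

def square(h: nx.Graph) -> nx.Graph:
    """Return H^2: join every pair of distinct vertices at distance <= 2 in H."""
    g = nx.Graph()
    g.add_nodes_from(h)
    for u, dists in nx.all_pairs_shortest_path_length(h, cutoff=2):
        g.add_edges_from((u, v) for v, d in dists.items() if 0 < d <= 2)
    return g

def is_square_root(h: nx.Graph, g: nx.Graph) -> bool:
    """Check the definition: H is a square root of G exactly when H^2 equals G."""
    h2 = square(h)
    return (set(h2.nodes) == set(g.nodes)
            and set(map(frozenset, h2.edges)) == set(map(frozenset, g.edges)))

h = nx.path_graph(5)   # P5: 0-1-2-3-4
g = square(h)          # the square adds the chords {0,2}, {1,3}, {2,4}
print(sorted(g.edges()), is_square_root(h, g))
```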
Abstract:
Complex networks can arise naturally and spontaneously from all things that act as part of a larger system. From the patterns of socialization between people to the way biological systems organize themselves, complex networks are ubiquitous, but they are currently poorly understood. A number of algorithms, designed by humans, have been proposed to describe the organizational behaviour of real-world networks, and these have recently led to breakthroughs in genetics, medicine, epidemiology, neuroscience, telecommunications and the social sciences. The algorithms, called graph models, represent significant human effort. Deriving accurate graph models is non-trivial, time-intensive and challenging, and may only yield useful results for very specific phenomena. An automated approach can greatly reduce the human effort required and, if effective, provide a valuable tool for understanding the large decentralized systems of interrelated things around us. To the best of the author's knowledge, this thesis proposes the first method for the automatic inference of graph models for complex networks with varied properties, with and without community structure. Furthermore, to the best of the author's knowledge, it is the first application of genetic programming to the automatic inference of graph models. The system and methodology were tested against benchmark data and shown to be capable of reproducing close approximations to well-known algorithms designed by humans. Furthermore, when used to infer a model for real biological data, the resulting model was more representative than models currently used in the literature.
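The core of such an automated approach is a fitness function that scores candidate generative models against an observed network. The sketch below only illustrates that idea; every generator choice and statistic here is an assumption rather than the thesis's system, and a genetic-programming search would evolve the generator programs themselves instead of picking from a fixed list.

```python
# Illustrative sketch: score candidate graph models against an observed network
# by comparing simple summary statistics (lower fitness = closer match).
import networkx as nx
import numpy as np

def summary(g: nx.Graph) -> np.ndarray:
    degrees = [d for _, d in g.degree()]
    return np.array([np.mean(degrees), np.std(degrees), nx.average_clustering(g)])

def fitness(candidate_generator, observed: nx.Graph, trials: int = 5) -> float:
    """Mean distance between the candidate's and the observed summary statistics."""
    target = summary(observed)
    scores = [np.linalg.norm(summary(candidate_generator(observed.number_of_nodes())) - target)
              for _ in range(trials)]
    return float(np.mean(scores))

observed = nx.barabasi_albert_graph(200, 3, seed=1)   # stands in for a real network
candidates = {
    "preferential attachment": lambda n: nx.barabasi_albert_graph(n, 3),
    "random (Erdos-Renyi)": lambda n: nx.gnp_random_graph(n, 0.03),
}
for name, gen in candidates.items():
    print(name, round(fitness(gen, observed), 3))
```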