942 results for Critical clearing time


Relevance: 30.00%

Abstract:

Analysis of stratigraphic terminology and classification shows that time-related stratigraphic units, which by definition have a global extent, are the concern of international commissions and committees of the International Union of Geological Sciences (IUGS). In contrast, lithostratigraphic and other closely related units are regional in extent and are catalogued in the International Stratigraphic Lexicon (ISL), the last volume of which was published in 1987. The International Commission on Stratigraphy (ICS) is currently attempting to revitalize the publication of the ISL, given that the information contained in published volumes has never been updated and that there has been a significant increase in stratigraphic research in recent decades. The proliferation of named units in the South Pyrenean and Ebro Basin Paleogene is evaluated to illustrate the extent of the problem. Moreover, new approaches to stratigraphic analysis have led to the naming of genetic units according to guidelines similar to those followed in the naming of descriptive or lithostratigraphic units, which has led to considerable confusion. The proposal to revitalize the ISL is accepted as part of the solution, which should also include the publication of critical catalogues and the creation of norms for genetic unit terminology.

Relevance: 30.00%

Abstract:

OBJECTIVES: Randomized clinical trials that enroll patients in critical or emergency care (acute care) setting are challenging because of narrow time windows for recruitment and the inability of many patients to provide informed consent. To assess the extent that recruitment challenges lead to randomized clinical trial discontinuation, we compared the discontinuation of acute care and nonacute care randomized clinical trials. DESIGN: Retrospective cohort of 894 randomized clinical trials approved by six institutional review boards in Switzerland, Germany, and Canada between 2000 and 2003. SETTING: Randomized clinical trials involving patients in an acute or nonacute care setting. SUBJECTS AND INTERVENTIONS: We recorded trial characteristics, self-reported trial discontinuation, and self-reported reasons for discontinuation from protocols, corresponding publications, institutional review board files, and a survey of investigators. MEASUREMENTS AND MAIN RESULTS: Of 894 randomized clinical trials, 64 (7%) were acute care randomized clinical trials (29 critical care and 35 emergency care). Compared with the 830 nonacute care randomized clinical trials, acute care randomized clinical trials were more frequently discontinued (28 of 64, 44% vs 221 of 830, 27%; p = 0.004). Slow recruitment was the most frequent reason for discontinuation, both in acute care (13 of 64, 20%) and in nonacute care randomized clinical trials (7 of 64, 11%). Logistic regression analyses suggested the acute care setting as an independent risk factor for randomized clinical trial discontinuation specifically as a result of slow recruitment (odds ratio, 4.00; 95% CI, 1.72-9.31) after adjusting for other established risk factors, including nonindustry sponsorship and small sample size. CONCLUSIONS: Acute care randomized clinical trials are more vulnerable to premature discontinuation than nonacute care randomized clinical trials and have an approximately four-fold higher risk of discontinuation due to slow recruitment. These results highlight the need for strategies to reliably prevent and resolve slow patient recruitment in randomized clinical trials conducted in the critical and emergency care setting.
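The adjusted odds ratio reported above comes from a multivariable logistic regression. Purely as an illustration of that kind of analysis (the data, variable names and model below are hypothetical, not the study's dataset or code), an adjusted odds ratio with its 95% CI could be obtained with statsmodels:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 894  # same order of magnitude as the cohort described above
    df = pd.DataFrame({
        "acute_care": rng.binomial(1, 0.07, n),       # 1 = acute care RCT
        "industry_sponsor": rng.binomial(1, 0.5, n),  # adjustment covariate
        "log_sample_size": np.log(rng.integers(20, 2000, n)),
    })
    # simulate discontinuation with a higher risk for acute care trials
    lin_pred = (-0.5 + 1.2 * df["acute_care"]
                - 0.4 * df["industry_sponsor"] - 0.1 * df["log_sample_size"])
    df["discontinued"] = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))

    model = smf.logit("discontinued ~ acute_care + industry_sponsor + log_sample_size",
                      data=df).fit(disp=False)
    or_acute = np.exp(model.params["acute_care"])              # adjusted odds ratio
    ci_low, ci_high = np.exp(model.conf_int().loc["acute_care"])
    print(f"adjusted OR for acute care: {or_acute:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")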

Relevance: 30.00%

Abstract:

BACKGROUND: In the context of the European Surveillance of Congenital Anomalies (EUROCAT) surveillance response to the 2009 influenza pandemic, we sought to establish whether there was a detectable increase of congenital anomaly prevalence among pregnancies exposed to influenza seasons in general, and whether any increase was greater during the 2009 pandemic than during other seasons. METHODS: We performed an ecologic time series analysis based on 26,967 pregnancies with nonchromosomal congenital anomaly conceived from January 2007 to March 2011, reported by 15 EUROCAT registries. Analysis was performed for EUROCAT-defined anomaly subgroups, divided by whether there was a prior hypothesis of association with influenza. Influenza season exposure was based on World Health Organization data. Prevalence rate ratios were calculated comparing pregnancies exposed to influenza season during the congenital anomaly-specific critical period for embryo-fetal development with nonexposed pregnancies. RESULTS: There was no evidence for an increased overall prevalence of congenital anomalies among pregnancies exposed to influenza season. We detected an increased prevalence of ventricular septal defect and tricuspid atresia and stenosis during the 2009 pandemic influenza season, but not during the 2007-2011 influenza seasons. Among congenital anomalies for which there was no prior hypothesis, the prevalence of tetralogy of Fallot was strongly reduced during influenza seasons. CONCLUSIONS: Our data do not suggest an overall association of pandemic or seasonal influenza with congenital anomaly prevalence. One interpretation is that apparent influenza effects found in previous individual-based studies were confounded by or interacting with other risk factors. The associations of heart anomalies with pandemic influenza could be strain specific.
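As a minimal sketch of the arithmetic behind a prevalence rate ratio of this kind (the counts below are invented for illustration, not EUROCAT data), the point estimate and a Wald-type 95% CI on the log scale can be computed as follows:

    import math

    def prevalence_rate_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
        """Ratio of two prevalences with a Wald 95% CI on the log scale."""
        p1 = cases_exposed / n_exposed
        p0 = cases_unexposed / n_unexposed
        prr = p1 / p0
        # standard error of log(PRR) for a ratio of two proportions
        se = math.sqrt(1 / cases_exposed - 1 / n_exposed
                       + 1 / cases_unexposed - 1 / n_unexposed)
        lo = math.exp(math.log(prr) - 1.96 * se)
        hi = math.exp(math.log(prr) + 1.96 * se)
        return prr, lo, hi

    # hypothetical counts: anomaly cases among pregnancies exposed vs. not exposed
    # to influenza season during the critical period for embryo-fetal development
    print(prevalence_rate_ratio(cases_exposed=60, n_exposed=9000,
                                cases_unexposed=45, n_unexposed=12000))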

Relevance: 30.00%

Abstract:

OBJECTIVE: To perform a critical review focusing on the applicability to daily clinical practice of data from three randomized controlled trials (RCTs): SWOG 8794, EORTC 22911, and ARO/AUO 96-02. METHODS AND MATERIALS: An analytical framework based on the identified population, interventions, comparators, and outcomes (PICO) was used to refine the search of the evidence from the three large randomized trials regarding the use of radiation therapy after prostatectomy as adjuvant therapy (ART). RESULTS: With regard to the inclusion criteria: (1) POPULATION: owing to the period in which the trials were designed, in two of the three trials (SWOG 8794 and EORTC 22911) patients could have a detectable PSA at the time of randomization, so that a substantial proportion of patients de facto received salvage RT (SRT) at non-normalised PSA levels rather than ART. (2) INTERVENTIONS: although all the trials showed a benefit of postoperative ART compared with a wait-and-see approach, the dose employed would now be considered inadequate. (3) COMPARATORS: the comparison arm in all three RCTs was an uncontrolled observation arm in which patients who subsequently developed biochemical failure were treated in various ways, with up to half of them receiving SRT at PSA levels well above 1 ng/mL, a threshold that would now be deemed inappropriate. (4) OUTCOMES: only in one trial (SWOG 8794) was ART found to significantly improve overall survival compared with observation, with a ten-year overall survival rate of 74% vs. 66%, although this might be partly the result of imbalanced risk factors due to competing event risk stratification. CONCLUSIONS: ART is supported by a high level of evidence from three RCTs with at least 10-year follow-up recording a benefit in biochemical PFS, but its penetrance in present daily practice should be reconsidered. While the relative benefit of ART versus SRT is eagerly awaited from ongoing randomized trials, a dynamic risk-stratified approach should drive the decision-making process.

Relevance: 30.00%

Abstract:

Illicit drug cutting is a complex problem that requires sharing knowledge from addiction studies, toxicology, criminology and criminalistics; as a result, cutting is not well understood by the forensic community. This review therefore aims at deciphering the different aspects of cutting by gathering information mainly from criminology and criminalistics. It tackles essentially the specificities of cocaine and heroin cutting. The article presents the detected cutting agents (adulterants and diluents), their evolution in time and space, and the analytical methodology implemented by forensic laboratories. Furthermore, it discusses when, in the history of the illicit drug, cutting may take place. Moreover, studies investigating how much cutting occurs in the country of destination are analysed. Lastly, the reasons for cutting are addressed. According to the literature, adulterants are added during production of the illicit drug or at a relatively high level of its distribution chain (e.g. before the product arrives in the country of destination or just after its importation into the latter). Their addition seems hardly justified by the desire to increase profits alone or to harm consumers' health. Instead, adulteration appears to be performed to enhance or to mimic the illicit drug's effects or to facilitate administration of the drug. Nowadays, caffeine, diltiazem, hydroxyzine, levamisole, lidocaine and phenacetin are frequently detected in cocaine specimens, while paracetamol and caffeine are almost exclusively identified in heroin specimens. This may reveal differences in the respective structures of production and/or distribution of cocaine and heroin. As the relevant information about cutting is spread across different scientific fields, a close collaboration should be set up to collect essential and unified data to improve knowledge and provide information for monitoring, control and harm reduction purposes. More research, covering several areas of investigation, should be carried out to gather relevant information.

Relevance: 30.00%

Abstract:

Raw measurement data does not always immediately convey useful information, but applying mathematical and statistical analysis tools to measurement data can improve the situation. Data analysis can offer benefits such as acquiring meaningful insight from the dataset, basing critical decisions on the findings, and ruling out human bias through proper statistical treatment. In this thesis we analyze data from an industrial mineral processing plant with the aim of studying the possibility of forecasting the quality of the final product, given by one variable, with a model based on the other variables. For the study, mathematical tools such as Qlucore Omics Explorer (QOE) and Sparse Bayesian regression (SB) are used. Later on, linear regression is used to build a model based on a subset of variables that appear to have the most significant weights in the SB model. The results obtained from QOE show that the variable representing the desired final product does not correlate with the other variables. For SB and linear regression, the results show that both SB and linear regression models built on 1-day averaged data seriously underestimate the variance of the true data, whereas the two models built on 1-month averaged data are reliable and able to explain a larger proportion of the variability in the available data, making them suitable for prediction purposes. However, it is concluded that no single model can fit the whole available dataset well and, therefore, it is proposed for future work to build piecewise non-linear regression models if the same dataset is used, or for the plant to provide another dataset, collected in a more systematic fashion than the present data, for further analysis.
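The workflow described here, first a sparse Bayesian regression to weight the predictor variables and then an ordinary linear regression on the most heavily weighted subset, can be sketched roughly as follows. This uses scikit-learn's ARDRegression as a stand-in for the SB model of the thesis; the toy data and the choice of keeping the five largest weights are assumptions for illustration only:

    import numpy as np
    from sklearn.linear_model import ARDRegression, LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n_samples, n_vars = 400, 20          # e.g. daily averages of plant variables
    X = rng.normal(size=(n_samples, n_vars))
    # toy "quality" variable depending on only two of the predictors
    y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=n_samples)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # sparse Bayesian regression: most coefficients are driven towards zero
    sb = ARDRegression().fit(X_train, y_train)

    # keep the variables with the largest absolute weights, refit a plain linear model
    keep = np.argsort(np.abs(sb.coef_))[-5:]
    ols = LinearRegression().fit(X_train[:, keep], y_train)
    print("selected variables:", keep)
    print("R^2 on held-out data:", ols.score(X_test[:, keep], y_test))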

Relevance: 30.00%

Abstract:

The purpose of this study was to evaluate the effect of the birth hospital and the time of birth on mortality and the long-term outcome of Finnish very low birth weight (VLBW) or very low gestational age (VLGA) infants. This study included all Finnish VLBW/VLGA infants born at <32 gestational weeks or with a birth weight of ≤1500 g, and controls born full-term and healthy. In the first part of the study, the mortality of VLBW/VLGA infants born in 2000–2003 was studied. The second part of the study consisted of a five-year follow-up of VLBW/VLGA infants born in 2001–2002. The study was performed using data from parental questionnaires and several registers. The one-year mortality rate was 11% for live-born VLBW/VLGA infants, 22% for live-born and stillborn VLBW/VLGA infants combined, and 0% for the controls. In live-born and in all (including stillborn) VLBW/VLGA infants, the adjusted mortality was lower among those born in level III hospitals compared with level II hospitals. Mortality rates of live-born VLBW/VLGA infants differed according to the university hospital district where the birth hospital was located, but there were no differences in mortality between the districts when stillborn infants were included. There was a trend towards lower mortality rates in VLBW/VLGA infants born during office hours compared with those born outside office hours (night time, weekends, and public holidays). When stillborn infants were included, this difference according to the time of birth was significant. Among five-year-old VLBW/VLGA children, morbidity, use of health care resources, and problems in behaviour and development were more common in comparison with the controls. The health-related quality of life of the surviving VLBW/VLGA children was good but statistically significantly lower than among the controls. The median and the mean number of quality-adjusted life-years were 4.6 and 3.6 out of a maximum of five years for all VLBW/VLGA children; for the controls, the median was 4.8 and the mean was 4.9. Morbidity rates, the use of health care resources, and the mean quality-adjusted life-years differed for VLBW/VLGA children according to the university hospital district of birth. However, the time of birth, the birth hospital level and the university hospital district were not associated with the health-related quality of life, nor with the behavioural and developmental scores of the survivors at the age of five years. In conclusion, the decreased mortality in level III hospitals was not gained at the expense of long-term problems. The results indicate that VLBW/VLGA deliveries should be centralized to level III hospitals and that the regional differences in treatment practices should be further clarified. A long-term follow-up of the outcome of VLBW/VLGA infants is important in order to recognize the critical periods of care and to optimise care. In the future, quality-adjusted life-years can be used as a uniform measure for comparing the effectiveness of care between VLBW/VLGA infants and different patient groups.
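Quality-adjusted life-years such as those reported above weight each year of follow-up by a utility between 0 (death) and 1 (full health). A minimal sketch of the calculation, with entirely hypothetical utility weights over a five-year follow-up, would be:

    import numpy as np

    def qalys(yearly_utilities):
        """Sum of utility weights (0-1) over the follow-up years."""
        return float(np.sum(yearly_utilities))

    # hypothetical five-year utility profiles for three children
    cohort = [
        [1.0, 1.0, 1.0, 1.0, 1.0],   # full health throughout: 5.0 QALYs
        [0.7, 0.8, 0.9, 0.9, 0.9],   # reduced health-related quality of life
        [0.4, 0.0, 0.0, 0.0, 0.0],   # death during the second year
    ]
    values = [qalys(c) for c in cohort]
    print("per-child QALYs:", values)
    print("mean:", np.mean(values), "median:", np.median(values))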

Relevance: 30.00%

Abstract:

Crossroads, crucibles and refuges are three words that may describe natural coastal lagoon environments. The words refer to the complex mix of marine and terrestrial influences, prolonged dilution due to the semi-enclosed nature and the function of a habitat for highly diverse plant and animal communities, some of which are endangered. To attain a realistic picture of the present situation, high vulnerability to anthropogenic impact should be added to the description. As the sea floor in coastal lagoons is usually entirely photic, macrophyte primary production is accentuated compared with open sea environments. There is, however, a lack of proper knowledge on the importance of vegetation for the general functioning of coastal lagoon ecosystems. The aim of this thesis is to assess the role of macrophyte diversity, cover and species identity over temporal and spatial scales for lagoon functions, and to determine which steering factors primarily restrict the qualitative and quantitative composition of vegetation in coastal lagoons. The results are linked to patterns of related trophic levels, and the indicative potential of vegetation for the assessment of general conditions in coastal lagoons is evaluated. This thesis includes five field studies conducted in flads and glo-flads in the brackish-water northern Baltic Sea. Flads and glo-flads are defined as a Baltic variety of coastal lagoons which, due to an inlet threshold and post-glacial land-uplift, will slowly be isolated from the open sea. This process shrinks inlet size, increases exposure and water retention, and is called habitat isolation. The studied coastal lagoons are situated in the archipelago areas of the eastern coast of Sweden, the Åland Islands and the south-west mainland of Finland, where land-uplift amounts to ca. 5 mm per year. Out of 400 evaluated sites, a total of 70 lagoons varying in inlet size, archipelago position and anthropogenic influence were chosen for further inventory to cover the essential environmental variation. Vegetation composition, cover and richness were measured together with several hydrographic and morphometric variables in the lagoons, both seasonally and inter-annually, to cover the general regional, local and temporal patterns influencing lagoon and vegetation development. On a smaller, species-level scale, the effects of macrophyte species identity and richness on the fish habitat function were studied by examining the influence of plant interaction on juvenile fish diversity. Thus, the active selection of plant mono- and polycultures by fish and the diversity of fish in the respective cultures were examined and related to plant height and water depth. The lagoons and their vegetation composition were found to experience a regime shift initiated by increased habitat isolation along with land-uplift. Vegetation composition altered, richness decreased and cover increased, forming a less isolated and a more isolated regime, named the vascular plant regime and the charophyte regime, respectively, according to the dominant vegetation. As total phosphorus in the water, turbidity and the impact of regional influences decreased in parallel, the dominance of charophytes and increasing cover seemed to buffer and stabilize conditions in the charophyte regime and indicated an increased functional role of vegetation for the lagoon ecosystem. The regime pattern was unaffected by geographical differences, while strong anthropogenic impact seemed to distort the pattern due to the loss of especially Chara tomentosa L. in the charophyte regime. The regimes were further found to be unperturbed by short-term temporal fluctuations. In fact, the seasonal and inter-annual dynamics reinforced the functional difference between the regimes through the increasing role of vegetation with habitat isolation and the resemblance of the charophyte regime to lake environments. For instance, total phosphorus and chlorophyll a concentrations in the water, which were greater at the beginning of the season in the charophyte regime than in the vascular plant regime, showed a steeper reduction over the season, reaching even lower values than in the vascular plant regime. Despite the regional importance of macrophyte diversity and its positive relationship to trophic diversity, species identity was underlined in the results of this thesis, especially with decreasing spatial scale. This result was supported partly by the increased role of charophytes in the functioning of the charophyte regime, but even more explicitly by the species-specific preference of juvenile fish for tall macrophyte monocultures. On a smaller, species-level scale, tall plant species in monoculture seemed to be able to increase their length, indicating that negative selection forms preferred habitat structures, which increase fish diversity. This negative relationship between plant and fish diversity suggests a shift in diversity patterns among trophic levels on a smaller scale. Thus, as diversity patterns seem complex and diverge among spatial scales, it might be ambiguous to extend the understanding of diversity relationships from one trophic level to the other. Altogether, the regime shift described here presents similarities to the regime development in marine lagoon environments and shallow lakes subjected to nutrient enrichment. However, due to nutrient buffering by vegetation with increased isolation and water retention as a consequence of the inlet threshold, the development seems opposite to the course along a eutrophication gradient described in marine lagoons lacking an inlet threshold, where the role of vegetation decreases. Thus, the results imply devastating consequences of inlet dredging (decreasing isolation) in terms of vegetation loss and nutrient release, and call for increased conservational supervision. The red-listed charophytes in particular would suffer from such interference, and the consequences are also likely to impair juvenile fish production. The fact that a species new to Finland, Chara connivens Salzm. ex Braun 1835, was discovered during this study further indicates the potential of the lagoons to serve as refuges for rare species.

Relevance: 30.00%

Abstract:

Maritime transports are essential for Finland, as over 80% of the country's foreign trade is seaborne and the possibilities to carry out these transports by other modes are limited. Any disruption in maritime transports has negative consequences for many sectors of the Finnish economy. Maritime transport thus represents critical infrastructure for Finland. This report focuses on the importance of maritime transports for security of supply in Finland, and for the so-called critical industries in particular. The report summarizes the results of Work Package 2 of the research project STOCA – “Study of cargo flows in the Gulf of Finland in emergency situations”. The aim of the research was to analyze the cargo flows and infrastructure that are vital for maintaining security of supply in Finland, as well as the consequences of disruptions in maritime traffic for the Finnish critical industries and for Finnish society. In the report we present the infrastructure and transport routes which are critical for maintaining security of supply in Finland. We discuss the import dependency of the critical industries and the importance of the Gulf of Finland ports for Finland. We assess vulnerabilities associated with the critical material flows of the critical industries, and the possibilities for alternative routings in case one or several of the Finnish ports were closed. As a concrete example of a transport disruption, we analyze the consequences of the Finnish stevedore strike at public ports (4.3.–19.3.2010). The strike stopped approximately 80% of the Finnish foreign trade. As a result of the strike, Finnish companies could not export their products and/or import raw materials, components and spare parts, or other essential supplies. We carried out personal interviews with representatives of companies in the Finnish critical industries to find out about the problems caused by the strike, how companies carried out their transports and how they managed to continue their operations during the strike. Discussions with the representatives of the companies gave us very practical insights into companies' preparedness for transport disruptions in general. Companies in the modern world are very vulnerable to transport disruptions because companies, regardless of industry, have tried to improve their performance by optimizing their resources, e.g. by reducing their inventory levels. At the same time they have become more and more dependent on continuous transports. Most companies involved in foreign trade have global operations and global supply chains, so any disruption anywhere in the world can have an impact on the operations of the company, causing considerable financial loss. The volcanic eruption in Iceland in April 2010, which stopped air traffic across Northern Europe, and most recently the earthquake and tsunami in Japan in March 2011 are examples of severe disruptions with considerable negative impacts on companies' supply chains. Even though the Finnish stevedore strike was a minor disruption compared to the natural catastrophes mentioned above, it showed the companies' vulnerability to transport disruptions very concretely. The Finnish stevedore strike gave all Finnish companies a concrete learning experience of the importance of preventive planning: it made them re-think their practical preparedness for transport risks and how they could continue their daily operations despite the problems. Many companies realized they needed to adapt their long-term countermeasures against transport disruptions. During the strike, companies took various actions to secure their supply chains: they raised their inventory levels before the strike began, re-scheduled or postponed their deliveries, shifted customer orders between production plants within their company's production network or, in the extreme case, bought finished products from a competitor to fulfil their customers' orders. Our results also show that the possibilities to prepare for transport disruptions differ between industries. Finnish society as a whole is very dependent on imports of energy, various raw materials and other supplies needed by the different industries. For many of the Finnish companies in the export industries and, for example, in energy production, maritime transport is the only transport mode they can use, due to the large volumes of materials transported or other characteristics of the goods. Maritime transport therefore cannot be replaced by any other transport mode. In addition, a significant share of transports is concentrated in certain ports. From a security of supply perspective, attention should be paid to finding ways to decrease import dependency and to ensuring that companies in the critical industries can maintain the continuity of their operations.

Relevance: 30.00%

Abstract:

Intensive and critical care nursing is a speciality in its own right, with its own nature, within the nursing profession. This speciality poses its own demands for nursing competencies. Intensive and critical care nursing is focused on severely ill patients and their significant others. The patients are comprehensively cared for, constantly monitored, and their vital functions are sustained artificially. The main goal is to gain time to treat the cause of the patient's condition or illness. The purpose of this empirical study was i) to describe and define competence and competence requirements in intensive and critical care nursing, ii) to develop a basic measurement scale for competence assessment in intensive and critical care nursing for graduating nursing students, and iii) to describe and evaluate graduating nursing students' basic competence in intensive and critical care nursing, using ICU nurses as the reference basis for self-evaluated basic competence. However, the main focus of this study was on the outcomes of nursing education in this nursing speciality. The study was carried out in different phases: basic exploration of competence (phases 1 and 2), instrumentation of competence (phase 3) and evaluation of competence (phase 4). Phase 1 (n=130) evaluated graduating nursing students' basic biological and physiological knowledge and skills for working in intensive and critical care with the Basic Knowledge Assessment Tool version 5 (BKAT-5, Toth 2012). Phase 2 focused on defining competence in intensive and critical care nursing with the help of a literature review (n=45 empirical studies), as well as competence requirements in intensive and critical care nursing with the help of experts (n=45) in a Delphi study. In phase 3, the Intensive and Critical Care Nursing Competence Scale (ICCN-CS) was developed and tested twice (pilot test 1: n=18 students and n=12 nurses; pilot test 2: n=56 students and n=54 nurses). Finally, in phase 4, graduating nursing students' competence was evaluated with the ICCN-CS and BKAT version 7 (Toth 2012). In order to develop a valid competence assessment scale for graduating nursing students and to evaluate and establish their competence, empirical data were collected at the same time from both graduating nursing students (n=139) and ICU nurses (n=431). Competence can be divided into clinical and general professional competence. It can be defined as a specific knowledge base, skill base, attitude and value base and experience base of nursing, together with the personal base of an intensive and critical care nurse. The personal base was excluded from this self-evaluation-based scale. The ICCN-CS-1 consists of 144 items (6 sum variables). Finally, it became evident that the experience base of competence is not a suitable sum variable in a holistic intensive and critical care competence scale for graduating nursing students because of their limited experience in this special nursing area. The ICCN-CS-1 is a reliable and tolerably valid scale for use among graduating nursing students and ICU nurses. Among students, basic competence in intensive and critical care nursing was self-rated as good by 69%, as excellent by 25% and as moderate by 6%. However, graduating nursing students' basic biological and physiological knowledge and skills for working in intensive and critical care were poor. The students rated their clinical and professional competence as good, and their knowledge base and skill base as moderate. They gave slightly higher ratings for their knowledge base than for their skill base. Differences in basic competence emerged between graduating nursing students and ICU nurses. The students' self-ratings of both their basic competence and their clinical and professional competence were significantly lower than the nurses' ratings. The students' self-ratings of their knowledge and skill base were also statistically significantly lower than the nurses' ratings. However, both groups reported the same attitude and value base, which was excellent. The strongest factor explaining students' conception of their competence was their experience of autonomy in nursing. Conclusions: Competence in intensive and critical care nursing is a multidimensional concept. Basic competence in intensive and critical care nursing can be measured with a self-evaluation-based scale, but an objective evaluation method should be used alongside it. Graduating nursing students' basic competence in intensive and critical care nursing is good, but their knowledge and skill bases are moderate; the biological and physiological knowledge base in particular is poor. Therefore, intensive and critical care nursing education should in future focus both on strengthening students' biological and physiological knowledge base and on strengthening their overall skill base. Practical implications are presented for nursing education, practice and administration. In the future, research should focus on education methods and contents, the mentoring of clinical practice and orientation programmes, as well as further development of the scale.

Relevance: 30.00%

Abstract:

Objective: To evaluate the effectiveness and safety of correction of pectus excavatum by the Nuss technique based on the available scientific evidence. Methods: We conducted an evidence synthesis following systematic processes of search, selection, extraction and critical appraisal. Outcomes were classified by importance and had their quality assessed with the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. Results: The selection process led to the inclusion of only one systematic review, which synthesized the results of nine observational studies comparing the Nuss and Ravitch procedures. The evidence found was rated as of poor and very poor quality. The Nuss procedure increased the incidence of hemothorax (RR = 5.15; 95% CI: 1.07-24.89), pneumothorax (RR = 5.26; 95% CI: 1.55-17.92) and the need for reintervention (RR = 4.88; 95% CI: 2.41-9.88) when compared with the Ravitch procedure. There was no statistical difference between the two procedures in the outcomes of general complications, blood transfusion, hospital stay and time to ambulation. The Nuss operation was faster than the Ravitch (mean difference [MD] = -69.94 minutes; 95% CI: -139.04 to -0.83). Conclusion: In the absence of well-designed prospective studies to clarify the evidence, especially in terms of aesthetics and quality of life, the surgical indication should be individualized and the choice of technique based on patient preference and the experience of the team.
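The risk ratios quoted above come from combining the individual observational studies in the included systematic review. As a rough sketch of how such a pooled RR and 95% CI arise (using generic inverse-variance fixed-effect pooling and invented per-study 2x2 counts, not the review's actual data or method):

    import math

    def log_rr_and_se(events_a, n_a, events_b, n_b):
        """Log risk ratio of arm A vs. arm B with its standard error."""
        rr = (events_a / n_a) / (events_b / n_b)
        se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
        return math.log(rr), se

    # hypothetical per-study counts: (Nuss events, Nuss n, Ravitch events, Ravitch n)
    studies = [(6, 80, 1, 75), (4, 60, 1, 66), (5, 90, 1, 88)]

    weights, weighted_logs = [], []
    for a, na, b, nb in studies:
        lrr, se = log_rr_and_se(a, na, b, nb)
        w = 1 / se ** 2                      # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * lrr)

    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    rr = math.exp(pooled_log)
    lo = math.exp(pooled_log - 1.96 * pooled_se)
    hi = math.exp(pooled_log + 1.96 * pooled_se)
    print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")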

Relevance: 30.00%

Abstract:

Consumers increasingly demand convenience when dealing with companies, and it is therefore important to provide professional, diverse and speedy service via the customer's preferred communication channel. These interactions between customer service and the customer play a critical role in the customer's future purchasing decisions. Customers who do not receive satisfactory customer service are willing to do business with another company that charges more but offers better customer service. This study identifies the critical success factors for the customer service in order to improve the customer service in line with the company's mission and meet customer expectations. A case study is used as the research method, and data are collected via observation, archival records and interviews over a time span of fourteen months. The analysis suggests three critical success factors: voice support, scalable and flexible customer service, and customer service champions. The study further analyzes improvement measures according to the critical success factors, concluding that Business Process Outsourcing is the most appropriate option to proceed with. As a conclusion of the study, the critical success factors enable achieving the goals of the customer service and aligning operations with the company's mission. Business Process Outsourcing plays an important role in improving the customer service by allowing fast expansion of a new service offering and access to a specialized workforce.

Relevance: 30.00%

Abstract:

Nowadays, computer-based systems tend to become more complex and to control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in loss of human lives as well as significant damage to the environment. Therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, different industrial standards prescribe the use of rigorous techniques for the development and verification of such systems. The more critical the system is, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on this system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for the integration of formal verification results into safety cases. This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates the elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
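Event-B models are written in Rodin's own set-theoretic notation rather than in a programming language, so the following is only a loose illustration of the abstraction-and-refinement idea described above: an abstract model states a safety invariant, and a refined model that adds detail must still preserve it. The toy system and all names are hypothetical, and preservation is checked here by simulation rather than by proof:

    import random

    LIMIT = 100  # safety requirement: tank level must never exceed LIMIT

    class AbstractTank:
        """Abstract model: the level changes nondeterministically but the invariant must hold."""
        def __init__(self):
            self.level = 0

        def invariant(self):
            return 0 <= self.level <= LIMIT

        def step(self):
            self.level = min(LIMIT, max(0, self.level + random.randint(-10, 10)))

    class RefinedTank(AbstractTank):
        """Refinement: adds a pump variable; its guard strengthens the abstract event."""
        def __init__(self):
            super().__init__()
            self.pump_on = False

        def step(self):
            # guard: only pump in if there is room, drain otherwise
            self.pump_on = self.level <= LIMIT - 10
            self.level = self.level + 10 if self.pump_on else max(0, self.level - 5)

    # crude check that both models preserve the safety invariant over many steps
    for model in (AbstractTank(), RefinedTank()):
        for _ in range(10_000):
            model.step()
            assert model.invariant(), f"invariant violated in {type(model).__name__}"
    print("safety invariant held in both the abstract and the refined model")

In Event-B itself, invariant preservation by each refined event would be established by discharging proof obligations in the Rodin platform, not by random simulation as in this sketch.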

Relevance: 30.00%

Abstract:

When the offset of a visual stimulus (GAP condition) precedes the onset of a target, saccadic reaction times are reduced in relation to the condition with no offset (overlap condition) - the GAP effect. However, the existence of the GAP effect for manual responses is still controversial. In two experiments using both simple (Experiment 1, N = 18) and choice key-press procedures (Experiment 2, N = 12), we looked for the GAP effect in manual responses and investigated possible contextual influences on it. Participants were asked to respond to the imperative stimulus that would occur under different experimental contexts, created by varying the array of warning-stimulus intervals (0, 300 and 1000 ms) and conditions (GAP and overlap): i) intervals and conditions were randomized throughout the experiment; ii) conditions were run in different blocks and intervals were randomized; iii) intervals were run in different blocks and conditions were randomized. Our data showed that no GAP effect was obtained for any manipulation. The predictability of stimulus occurrence produced the strongest influence on response latencies. In Experiment 1, simple manual responses were shorter when the intervals were blocked (247 ms, P < 0.001) in relation to the other two contexts (274 and 279 ms). Despite the use of choice key-press procedures, Experiment 2 produced a similar pattern of results. A discussion addressing the critical conditions to obtain the GAP effect for distinct motor responses is presented. In short, our data stress the relevance of the temporal allocation of attention for behavioral performance.