822 results for Consumerization of IT


Relevance:

90.00%

Publisher:

Abstract:

Observations of the Sun’s corona during the space era have led to a picture of relatively constant, but cyclically varying, solar output and structure. Longer-term, more indirect measurements, such as from ¹⁰Be, coupled with other, albeit less reliable, contemporaneous reports, however, suggest periods of significant departure from this standard. The Maunder Minimum was one such epoch, where: (1) sunspots effectively disappeared for long intervals during a 70 yr period; (2) eclipse observations suggested the distinct lack of a visible K-corona but the possible appearance of the F-corona; (3) reports of aurorae were notably reduced; and (4) cosmic ray intensities at Earth were inferred to be substantially higher. Using a global thermodynamic MHD model, we have constructed a range of possible coronal configurations for the Maunder Minimum period and compared their predictions with these limited observational constraints. We conclude that the most likely state of the corona during—at least—the later portion of the Maunder Minimum was not merely that of the 2008/2009 solar minimum, as has been suggested recently, but rather a state devoid of any large-scale structure, driven by a photospheric field composed of only ephemeral regions and likely substantially reduced in strength. Moreover, we suggest that the Sun evolved from a 2008/2009-like configuration at the start of the Maunder Minimum toward an ephemeral-only configuration by its end, supporting a prediction that we may be on the cusp of a new grand solar minimum.

Relevance:

90.00%

Publisher:

Abstract:

There are some long-established biases in atmospheric models that originate from the representation of tropical convection. Previously, it has been difficult to separate cause and effect because errors are often the result of a number of interacting biases. Recently, researchers have gained the ability to run multiyear global climate model simulations with grid spacings small enough to switch the convective parameterization off, which permits the convection to develop explicitly. There are clear improvements to the initiation of convective storms and the diurnal cycle of rainfall in the convection-permitting simulations, which enables a new process-study approach to model bias identification. In this study, multiyear global atmosphere-only climate simulations with and without convective parameterization are undertaken with the Met Office Unified Model and are analyzed over the Maritime Continent region, where convergence from sea-breeze circulations is key for convection initiation. The analysis shows that, although the simulation with parameterized convection is able to reproduce the key rain-forming sea-breeze circulation, the parameterization is not able to respond realistically to the circulation. A feedback of errors also occurs: the convective parameterization causes rain to fall in the early morning, which cools and wets the boundary layer, reducing the land–sea temperature contrast and weakening the sea breeze. This is, however, an effect of the convective bias, rather than a cause of it. Improvements to how and when convection schemes trigger convection will improve both the timing and location of tropical rainfall and representation of sea-breeze circulations.

Relevance:

90.00%

Publisher:

Abstract:

The food industry is critical to any nation’s health and well-being; it is also critical to the economic health of a nation, since it can typically constitute over a fifth of the nation’s manufacturing GDP. Food engineering is a discipline that ought to be at the heart of the food industry. Unfortunately, this discipline is not playing its rightful role today: engineering has been relegated to the role of a service provider to the food industry, instead of being a strategic driver for the very growth of the industry. This paper hypothesises that the food engineering discipline today continues much as it did in the last century and has not risen to the challenges it really faces. The paper therefore categorises the challenges as those posed by: 1. business dynamics, 2. market forces, 3. the manufacturing environment and 4. environmental considerations, and finds the current scope and subject-knowledge competencies of food engineering to be inadequate in meeting these challenges. The paper identifies (a) health, (b) environment and (c) security as the three key drivers of the discipline, and proposes a new definition of food engineering. This definition requires food engineering to have a broader science base which includes biophysical, biochemical and health sciences, in addition to engineering sciences. This definition, in turn, leads to the discipline acquiring a new set of subject-knowledge competencies that is fit for purpose for this day and age, and hopefully for the foreseeable future. The possibility of this approach leading to the development of a higher education programme in food engineering is demonstrated by adopting a theme-based curriculum development with five core themes, supplemented by appropriate enabling and knowledge-integrating courses. At the heart of this theme-based approach is an attempt to combine the engineering of process and product in a purposeful way, termed here Food Product Realisation Engineering. Finally, the paper also recommends the future development of two possible niche specialisation programmes, in Nutrition and Functional Food Engineering and in Gastronomic Engineering. It is hoped that this reconceptualisation of the discipline will not only make it more purposeful for the food industry, but also make the subject more intellectually challenging and attract bright young minds to the discipline.

Relevance:

90.00%

Publisher:

Abstract:

Within the context of the normalized variable formulation (NVF) of Leonard and the total variation diminishing (TVD) constraints of Harten, this paper presents an extension of previous work by the authors for solving unsteady incompressible flow problems. The main contributions of the paper are threefold. First, it presents the results of the development and implementation of a bounded high-order upwind adaptive QUICKEST scheme in the robust 3D code Freeflow, for the numerical solution of the full incompressible Navier-Stokes equations. Second, it reports numerical simulation results for the 1D shock tube problem, a 2D impinging jet and 2D/3D broken-dam flows, and compares these results with existing analytical and experimental data. Third, it presents the application of the numerical method to 3D free surface flow problems. (C) 2007 IMACS. Published by Elsevier B.V. All rights reserved.
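
The idea behind such a bounded scheme is easiest to see in one dimension. Below is a minimal sketch, assuming 1D linear advection with positive velocity and periodic boundaries, of a QUICKEST face value clipped to the TVD region in Leonard's normalized variables; it illustrates the technique only and is not the authors' Freeflow implementation.

```python
# Sketch: QUICKEST face values bounded by a TVD-type universal limiter
# in normalized variables, for phi_t + a*phi_x = 0 with a > 0.
import numpy as np

def bounded_quickest_step(phi, c):
    """One explicit step; c = a*dt/dx is the Courant number (0 < c < 1)."""
    n = len(phi)
    face = np.empty(n)                       # face[i] approximates phi at i+1/2
    for i in range(n):
        U, C, D = phi[i - 1], phi[i], phi[(i + 1) % n]   # upstream, central, downstream
        # Unbounded QUICKEST face value
        f = 0.5 * (C + D) - 0.5 * c * (D - C) - (1 - c**2) / 6.0 * (D - 2*C + U)
        if abs(D - U) < 1e-12:
            face[i] = C                      # locally flat: first-order upwind
            continue
        Ch = (C - U) / (D - U)               # normalized variable of node C
        Fh = (f - U) / (D - U)               # normalized face value
        if 0.0 < Ch < 1.0:
            Fh = min(max(Fh, Ch), 1.0, Ch / c)   # clip to the TVD/boundedness region
            face[i] = U + Fh * (D - U)
        else:
            face[i] = C                      # non-monotone data: fall back to upwind
    return phi - c * (face - np.roll(face, 1))

# Advect a square wave: the bounded scheme stays free of over/undershoots.
x = np.linspace(0.0, 1.0, 200, endpoint=False)
phi = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)
for _ in range(100):
    phi = bounded_quickest_step(phi, c=0.5)
```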

Relevance:

90.00%

Publisher:

Abstract:

The need for efficient (fast and low-consumption) optoelectronic devices has always been the driving force behind the investigation of materials with new or improved properties. To be commercially attractive, however, these materials should be compatible with the current microelectronics industry and/or telecommunications systems. Silicon-based compounds, with their mature processing technology and natural abundance, partially comply with such requirements, as long as they emit light. Motivated by these issues, this work reports on the optical properties of amorphous Si films doped with Fe. The films were prepared by sputtering a Si+Fe target and were investigated by different spectroscopic techniques. According to the experimental results, both the Fe concentration and the thermal annealing of the samples induce changes in their atomic structure and optical-electronic properties. In fact, after thermal annealing at ~750 °C, the samples partially crystallize with the development of Si and/or β-FeSi₂ crystallites. In such cases, certain samples present light emission at ~1500 nm that depends on the presence of β-FeSi₂ crystallites and is very sensitive to the annealing conditions. The most likely reasons for the light emission (or the absence of it) in the Fe-doped Si samples considered are presented and discussed in view of their main structural-electronic characteristics. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

This paper examines the extensive regions of Proterozoic accretionary belts that either formed most of the Amazonian Craton or are marginal to its southeastern border. Their overall geodynamic significance is considered taking into account the paleogeographic reconstructions of Columbia, Rodinia and Gondwana. Amazonia would be part of Columbia together with Laurentia, North China and Baltica, forming a continuous continental landmass linked by the Paleo- to Mesoproterozoic mobile belts that constitute large portions of it. The Rodinia supercontinent was formed in the Mesoproterozoic by the agglutination of the existing cratonic fragments, such as Laurentia and Amazonia, during contemporary continental collisions worldwide. The available paleomagnetic data suggest that Laurentia and Amazonia remained attached until at least 600 Ma. Since all other cratonic units surrounding Laurentia had already rifted away by that time, the separation between Amazonia and Laurentia marks the final break-up of Rodinia with the opening of the Iapetus Ocean. (C) 2009 International Association for Gondwana Research. Published by Elsevier B.V. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

We study here when the composite of n irreducible morphisms in almost sectional paths is non-zero and lies in ℜ^{n+1}. (C) 2007 Elsevier B.V. All rights reserved.
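
For context (standard Auslander-Reiten theory, not specific to this paper): irreducible morphisms lie in the radical ℜ of the module category but not in its square, so a composite of n of them automatically lies in the n-th radical power, and the question is when it drops into the next power without vanishing:

```latex
f_i \in \Re(X_{i-1}, X_i) \setminus \Re^{2}(X_{i-1}, X_i), \quad i = 1, \dots, n
\;\Longrightarrow\;
f_n \circ \cdots \circ f_1 \in \Re^{n}(X_0, X_n);
\qquad
\text{when is } f_n \circ \cdots \circ f_1 \in \Re^{n+1}(X_0, X_n) \setminus \{0\}?
```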

Relevance:

90.00%

Publisher:

Abstract:

Despite the therapeutic potential of tempol (4-hydroxy-2,2,6,6-tetramethyl-1-piperidinyloxy) and related nitroxides as antioxidants, their effects on peroxidase-mediated protein tyrosine nitration remain unexplored. This posttranslational protein modification is a biomarker of nitric oxide-derived oxidants and, relevantly, it parallels tissue injury in animal models of inflammation and is attenuated by tempol treatment. Here, we examine tempol effects on ribonuclease (RNase) nitration mediated by myeloperoxidase (MPO), a mammalian enzyme that plays a central role in various inflammatory processes. Some experiments were also performed with horseradish peroxidase (HRP). We show that tempol efficiently inhibits peroxidase-mediated RNase nitration. For instance, 10 μM tempol was able to inhibit by 90% the yield of 290 μM 3-nitrotyrosine produced from 370 μM RNase. The effect of tempol was not completely catalytic because part of it was consumed by recombination with RNase-tyrosyl radicals. The second-order rate constants of the reaction of tempol with MPO compounds I and II were determined by stopped-flow kinetics as 3.3 × 10⁶ and 2.6 × 10⁴ M⁻¹ s⁻¹, respectively (pH 7.4, 25 °C); the corresponding HRP constants were orders of magnitude smaller. Time-dependent hydrogen peroxide and nitrite consumption and oxygen production in the incubations were quantified experimentally and modeled by kinetic simulations. The results indicate that tempol inhibits peroxidase-mediated RNase nitration mainly because of its reaction with nitrogen dioxide to produce the oxammonium cation, which, in turn, recycles back to tempol by reacting with hydrogen peroxide and superoxide radical to produce oxygen and regenerate nitrite. The implications for nitroxide antioxidant mechanisms are discussed.
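
As a rough illustration of the kind of kinetic simulation the abstract mentions, the sketch below integrates second-order rate laws for tempol reacting with MPO compounds I and II using the rate constants quoted above. The initial concentrations and the compound I → compound II reduction step are illustrative assumptions, not the paper's measured conditions.

```python
# Sketch: numerical integration of a toy tempol/MPO scheme with scipy.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 3.3e6, 2.6e4            # M^-1 s^-1: tempol + compound I / compound II

def rates(t, y):
    tempol, cpd1, cpd2 = y
    v1 = k1 * tempol * cpd1      # tempol + compound I  -> oxammonium + compound II
    v2 = k2 * tempol * cpd2      # tempol + compound II -> oxammonium + resting MPO
    return [-v1 - v2, -v1, v1 - v2]

y0 = [10e-6, 0.1e-6, 0.0]        # 10 uM tempol, 0.1 uM compound I (assumed)
sol = solve_ivp(rates, (0.0, 60.0), y0, method="LSODA")
print(sol.y[0, -1])              # tempol remaining after 60 s
```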

Relevance:

90.00%

Publisher:

Abstract:

DD K is an antimicrobial peptide previously isolated from the skin of the amphibian Phyllomedusa distincta. The effect of cholesterol on the binding of synthetic DD K to egg lecithin liposomes was investigated by the intrinsic fluorescence of its tryptophan residue, measurements of the kinetics of 5(6)-carboxyfluorescein (CF) leakage, dynamic light scattering and isothermal titration microcalorimetry. An 8 nm blue shift of the tryptophan maximum emission fluorescence was observed when DD K was in the presence of lecithin liposomes, compared to the value observed for liposomes containing 43 mol% cholesterol. The rate and extent of CF release were also significantly reduced by the presence of cholesterol. Dynamic light scattering showed that lecithin liposome size increases from 115 to 140 nm when titrated with DD K, but the addition of cholesterol reduces the liposome size increments. Isothermal titration microcalorimetry studies showed that DD K binding, both to liposomes containing cholesterol and to liposomes devoid of it, is more entropically than enthalpically favored. Nevertheless, the peptide concentration necessary to furnish an adjustable titration curve is much higher for liposomes containing cholesterol at 43 mol% (2 mmol L⁻¹) than in its absence (93 μmol L⁻¹). Apparent binding constant values were 2160 and 10,000 L mol⁻¹, respectively. Taken together, the data indicate that DD K binding to phosphatidylcholine liposomes is significantly affected by cholesterol, which contributes to explaining the low hemolytic activity of the peptide. (C) 2007 Elsevier Inc. All rights reserved.
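
The two apparent binding constants translate directly into binding free energies via ΔG° = −RT ln K; at an assumed 298 K (the abstract does not state the temperature):

```latex
\Delta G^{\circ} = -RT\ln K:\qquad
\Delta G^{\circ}_{\text{no chol}} = -(8.314)(298)\ln(10\,000) \approx -22.8\ \mathrm{kJ\,mol^{-1}},
\qquad
\Delta G^{\circ}_{\text{43 mol\% chol}} = -(8.314)(298)\ln(2160) \approx -19.0\ \mathrm{kJ\,mol^{-1}}.
```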

Relevance:

90.00%

Publisher:

Abstract:

The main objective of this degree project is to implement an Application Availability Monitoring (AAM) system named Softek EnView for Fujitsu Services. The aim of implementing the AAM system is to proactively identify end user performance problems, such as application and site performance, before the actual end users experience them. No matter how well applications and sites are designed and no matter how well they meet business requirements, they are useless to the end users if performance is slow and/or unreliable. It is important for the customers to find out whether end user problems are caused by the network or by an application malfunction. Softek EnView comprises the following components: Robot, Monitor, Reporter, Collector and Repository. The implemented system, however, is designed to use only some of these EnView elements: Robot, Reporter and Repository. Robots can be placed at any key user location and are dedicated to customers, which means that as the number of customers increases, the number of Robots increases with it. To make the AAM system ideal for the company to use, it was integrated with Fujitsu Services’ centralised monitoring system, BMC PATROL Enterprise Manager (PEM); this was in fact the reason for deciding to drop the EnView Monitor element. After the system was fully implemented, the AAM system was ready for production. Transactions were (and are) written and deployed on Robots to simulate typical end user actions. These transactions are configured to run at certain intervals, which are defined together with the customers. While they are driven against customers’ applications automatically, the transactions collect availability and response time data continuously. In case of a failure in a transaction, the Robot immediately quits the transaction and writes detailed information to a log file about what went wrong and which element failed while going through the application. An alert is then generated by a BMC PATROL Agent based on this data and sent to the BMC PEM. Fujitsu Services’ monitoring room receives the alert and reacts to it according to the ITIL incident management process, alerting system specialists to critical incidents so that problems are resolved. As a result of the data gathered by the Robots, weekly reports, which contain detailed statistics and trend analyses of the ongoing quality of IT services, are provided to the customers.
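
A minimal sketch of the Robot behaviour described above: run a scripted transaction at a fixed interval, record availability and response time, and on failure log the failing step and raise an alert for the central monitor. This is not Softek EnView's actual API; the URLs, step names, and alert hook are hypothetical placeholders (the real system forwards alerts to BMC PEM via a PATROL Agent).

```python
# Sketch: a synthetic-transaction robot loop.
import logging
import time
import urllib.request

logging.basicConfig(filename="robot.log", level=logging.INFO)

def run_transaction(steps):
    """steps: list of (name, url); returns (ok, elapsed_seconds)."""
    start = time.monotonic()
    for name, url in steps:
        try:
            urllib.request.urlopen(url, timeout=10)
        except OSError as exc:
            # Detailed log of what went wrong and which element failed.
            logging.error("step %r failed: %s", name, exc)
            return False, time.monotonic() - start
    return True, time.monotonic() - start

def send_alert(message):
    """Placeholder for forwarding to the central monitor (e.g. BMC PEM)."""
    logging.critical("ALERT: %s", message)

steps = [("login", "https://example.com/login"),
         ("search", "https://example.com/search")]
while True:
    ok, elapsed = run_transaction(steps)
    logging.info("availability=%s response_time=%.2fs", ok, elapsed)
    if not ok:
        send_alert("transaction failed")
    time.sleep(300)   # interval agreed with the customer
```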

Relevance:

90.00%

Publisher:

Abstract:

Pulp and paper production is a very energy-intensive industry sector. Both Sweden and the U.S. are major pulp and paper producers. This report examines the energy use and the CO2 emissions connected with the pulp and paper industry in the two countries from a lifecycle perspective. New technologies make it possible to increase the electricity production in the integrated pulp and paper mill through black liquor gasification and a combined cycle (BLGCC). That way, the mill can produce excess electricity, which can be sold and replace electricity produced in power plants. In this process the by-products formed in pulp-making are used as fuel to produce electricity. In today's pulp and paper mills, the technology for generating energy from the by-product in a Tomlinson boiler is not as efficient as the BLGCC technology. Scenarios have been designed to investigate the results of using the BLGCC technique in a lifecycle analysis. Two scenarios are represented by a 1994 mill in the U.S. and a 1994 mill in Sweden; these are based on the average energy intensity of pulp and paper mills as operating in 1994 in the U.S. and Sweden, respectively. The two other scenarios are constituted by a »reference mill« in the U.S. and Sweden using state-of-the-art technology. We investigate the impact of varying recycling rates on total energy use and CO2 emissions from the production of printing and writing paper. To economize with the wood and thereby save trees, the trees that are replaced by recycling can be used in a biomass gasification combined cycle (BIGCC) to produce electricity in a power station. This produces extra electricity with a lower CO2 intensity than electricity generated by, for example, coal-fired power plants. The lifecycle analysis in this thesis also includes waste treatment in the paper lifecycle. Both Sweden and the U.S. recycle paper; still, a lot of paper waste remains, and this paper is part of the countries' municipal solid waste (MSW). Much of the MSW is landfilled, but parts of it are incinerated to extract electricity. The thesis has designed special scenarios for the use of MSW in the lifecycle analysis. This report studies and compares two different countries and two different BLGCC efficiencies in four different scenarios. This gives a wide survey and points to essential parameters to reflect on when making assumptions in a lifecycle analysis. The report shows that there are three key parameters that have to be carefully considered when making a lifecycle analysis of wood from an energy and CO2-emission perspective in the pulp and paper mill in the U.S. and in Sweden: first, the energy efficiency of the pulp and paper mill; then the efficiency of the BLGCC; and last, the CO2 intensity of the electricity displaced by BIGCC- or BLGCC-generated electricity. It also shows that with the technology we have today, it is possible to produce CO2-free paper with a waste paper share of up to 30%. The thesis discusses the system boundaries and the assumptions. Further and more detailed research, including among other things the system boundaries and forestry, is recommended for more specific answers.
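
The scenario arithmetic can be condensed into a few lines. The sketch below uses purely illustrative numbers and exposes the three key parameters named above as its inputs: mill energy intensity, BLGCC efficiency, and the CO2 intensity of the displaced grid electricity.

```python
# Sketch: net fossil CO2 per tonne of paper after crediting exported power.
def net_co2_per_tonne(blgcc_efficiency, grid_co2_kg_per_gj,
                      black_liquor_gj=18.0, fossil_co2_kg=200.0):
    """All defaults are illustrative placeholders, not the thesis's data."""
    electricity_out = black_liquor_gj * blgcc_efficiency   # GJ(e) from by-products
    credit = electricity_out * grid_co2_kg_per_gj          # displaced grid emissions
    return fossil_co2_kg - credit

# Higher BLGCC efficiency or dirtier displaced electricity -> larger credit.
print(net_co2_per_tonne(0.30, 60.0))   # -124.0 kg: net credit
print(net_co2_per_tonne(0.15, 30.0))   #  119.0 kg: net emission
```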

Relevance:

90.00%

Publisher:

Abstract:

Video exposure monitoring (VEM) is a group of methods used for occupational hygiene studies. The method is based on the combined use of video recordings and measurements taken with real-time monitoring instruments. A commonly used name for VEM is PIMEX. Since PIMEX was invented in the mid-1980s, the method has been implemented and developed in a number of countries. With the aim of giving an updated picture of how VEM methods are used, and of investigating needs for further development, a number of workshops have been organised in Finland, the UK, the Netherlands, Germany and Austria. Field studies have also been made to study to what extent the PIMEX method can improve workers' motivation to actively take part in actions aimed at workplace improvements. The results from the workshops illustrate clearly that there is an impressive amount of experience and ideas for the use of VEM within the network of groups participating in the workshops. The sharing of these experiences between the groups, as well as dissemination of it to wider groups, is, however, limited. The field studies made together with a number of welders indicate that their motivation to take part in workplace improvements improves after the PIMEX intervention. The results are, however, not totally conclusive, and further studies focusing on motivation are called for. It is recommended that strategies for VEM, for interventions in single workplaces as well as for exposure categorisation and the production of training material, are developed further. It is also recommended to conduct a research project to evaluate the effects of the use of VEM and to disseminate knowledge about the potential of VEM to occupational hygiene experts and others who may benefit from its use.
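
At its core, a VEM/PIMEX tool aligns real-time instrument readings with video frames by timestamp, so each frame can be overlaid with the exposure level measured at that moment. A minimal sketch of that alignment, with made-up sampling rates, column names and values:

```python
# Sketch: attach to each video frame the most recent exposure reading.
import pandas as pd

# Frame timestamps at 25 fps and instrument samples at 1 Hz (illustrative).
frames = pd.DataFrame(
    {"t": pd.date_range("2024-01-01 08:00", periods=250, freq="40ms")})
samples = pd.DataFrame({
    "t": pd.date_range("2024-01-01 08:00", periods=10, freq="1s"),
    "exposure_mg_m3": [0.2, 0.3, 1.1, 2.4, 1.8, 0.9, 0.5, 0.4, 0.3, 0.2],
})
# For every frame, take the last reading at or before its timestamp.
overlay = pd.merge_asof(frames, samples, on="t", direction="backward")
```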

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a two-step pseudo-likelihood estimation technique for generalized linear mixed models with random effects that are correlated between groups. The core idea is to deal with the intractable integrals in the likelihood function by a multivariate Taylor approximation. The accuracy of the estimation technique is assessed in a Monte Carlo study. An application of it with a binary response variable is presented using a real data set on credit defaults from two Swedish banks. Thanks to the two-step estimation technique, the proposed algorithm outperforms conventional pseudo-likelihood algorithms in terms of computational time.
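
To illustrate the central trick, approximating the intractable integral by a second-order Taylor expansion of the log-integrand around its mode (a Laplace-type approximation), here is a minimal one-group sketch for a logit model with a random intercept. This shows the general technique, not the paper's exact two-step algorithm.

```python
# Sketch: Laplace approximation of one group's marginal log-likelihood
# in a random-intercept logistic GLMM.
import numpy as np
from scipy.optimize import minimize_scalar

def laplace_loglik(y, x, beta, sigma2):
    """log of integral over b ~ N(0, sigma2) of the Bernoulli likelihood."""
    def neg_log_integrand(b):
        eta = beta * x + b
        loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
        return -(loglik - b**2 / (2 * sigma2))
    res = minimize_scalar(neg_log_integrand)     # mode b_hat of the integrand
    b, h = res.x, 1e-4
    # Curvature at the mode via a central second difference.
    curv = (neg_log_integrand(b + h) - 2 * neg_log_integrand(b)
            + neg_log_integrand(b - h)) / h**2
    # log L ~ g(b_hat) + 0.5*log(2*pi/curv) - log-normalizer of N(0, sigma2)
    return -res.fun + 0.5 * np.log(2 * np.pi / curv) - 0.5 * np.log(2 * np.pi * sigma2)

y = np.array([1, 0, 1, 1]); x = np.array([0.5, -1.0, 0.2, 1.3])
print(laplace_loglik(y, x, beta=0.8, sigma2=1.0))
```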

Relevance:

90.00%

Publisher:

Abstract:

A customer is presumed to gravitate to a facility according to the distance to it and the attractiveness of it. Regarding the location of the facility, however, the presumption is that the customer opts for the shortest route to the nearest facility. This paradox was recently resolved by the introduction of the gravity p-median model. The model had yet to be implemented and tested empirically. We implemented the model in an empirical problem of locating locksmiths, vehicle inspections, and retail stores of vehicle spare parts, and we compared the solutions with those of the p-median model. We found the gravity p-median model to be of limited use for the problem of locating facilities, as it either gives solutions similar to the p-median model or gives unstable solutions due to a non-concave objective function.
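
For concreteness, a minimal sketch of the gravity p-median objective: unlike the classic p-median, demand at each node is split over the open facilities with gravity-style (Huff) probabilities driven by attractiveness and distance decay, and the objective is the expected weighted distance. The data, decay parameter, and brute-force search below are illustrative assumptions.

```python
# Sketch: evaluating and minimizing a gravity p-median objective.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
d = rng.uniform(1, 10, size=(6, 4))       # distances: 6 demand nodes x 4 sites
w = rng.uniform(1, 5, size=6)             # demand weights
A = np.ones(4)                            # facility attractiveness
beta = 0.5                                # distance-decay parameter

def gravity_cost(open_sites):
    dd = d[:, open_sites]
    u = A[open_sites] * np.exp(-beta * dd)        # utilities
    p = u / u.sum(axis=1, keepdims=True)          # Huff choice probabilities
    return float(np.sum(w[:, None] * p * dd))     # expected weighted distance

# Exhaustive search over all ways to open p = 2 of the 4 candidate sites.
best = min(combinations(range(4), 2), key=lambda s: gravity_cost(list(s)))
print(best, gravity_cost(list(best)))
```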

Relevance:

90.00%

Publisher:

Abstract:

In today's society it is increasingly important for companies to retain their existing customers as competition grows ever harder. As a consequence, companies try to take measures to nurture their customer relationships. This problem is also highly relevant in the IT industry, where working agilely in IT projects is common. Our partner company has seen a growing need to measure service quality in a recurring way within IT projects, in order to measure relevant variables that extend beyond the requirements specification. To gauge the success of this way of working, one wants to be able to measure a customer satisfaction index (Nöjd Kund Index, NKI) so that IT projects can be compared internally within the company. As previous research has shown a lack of models encompassing both the measurement of service quality and NKI, the relevant literature was studied; it emerged that the SERVQUAL model is established for measuring service quality and that the American Customer Satisfaction Index (ACSI) model is established for measuring customer satisfaction. This formed the basis for the problem statement and aim of the work. The aim of the work is to create an extended model for measuring NKI in order to compare IT projects internally, together with recurring measurement of service quality within IT projects. The model was developed using the Design and Creation research strategy, with interviews conducted to capture requirements for the extended model. The result of this research strategy was an extended model, based on the models mentioned above, with a recurring approach to measuring service quality within IT projects and measurement of NKI for comparing IT projects internally within the company. The resulting model was then verified through further interviews with respondents with solid experience from the customer side of IT projects. From these interviews it could be concluded that the model can be considered applicable in practice for IT projects.
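
A minimal sketch of the two quantities the extended model combines: SERVQUAL gap scores (perception minus expectation per quality dimension) and an ACSI-style satisfaction index rescaled to 0-100. The survey items, scales and weights below are illustrative assumptions, not the model's actual questionnaire.

```python
# Sketch: SERVQUAL gaps and an ACSI-style customer satisfaction index.
import numpy as np

# SERVQUAL: expectation (E) and perception (P) ratings on a 1-7 Likert
# scale, one row per respondent, one column per quality dimension.
E = np.array([[6, 7, 6, 5, 6], [7, 6, 6, 6, 7]], dtype=float)
P = np.array([[5, 6, 6, 4, 6], [6, 5, 7, 5, 6]], dtype=float)
gap = (P - E).mean(axis=0)           # negative = expectations not met

# ACSI-style index from three satisfaction items on a 1-10 scale,
# rescaled to 0-100 and averaged over respondents.
items = np.array([[8, 7, 9], [6, 7, 7], [9, 9, 8]], dtype=float)
weights = np.array([0.4, 0.3, 0.3])  # illustrative item weights
nki = ((items @ weights - 1) / 9 * 100).mean()
print(gap, round(nki, 1))
```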