925 results for Standalone System with Back up


Relevance:

100.00%

Publisher:

Abstract:

In his study - File Control: The Heart Of Business Computer Management - William G. O'Brien, Assistant Professor, The School of Hospitality Management at Florida International University, initially informs you: “Even though computers are an everyday part of the hospitality industry, many managers lack the knowledge and experience to control and protect the files in these systems. The author offers guidelines which can minimize or prevent damage to the business as a whole.” O'Brien opens this study with anecdotal instances illustrating the failure of hospitality managers to exercise due caution with regard to computer-supported information systems inside their restaurants and hotels. “Of the three components that make up any business computer system (data files, programs, and hardware), it is files that are most important, perhaps irreplaceable, to the business,” O'Brien informs you. O'Brien breaks files down into two distinct categories: files of extrinsic value and files of intrinsic value. An example of an extrinsic-value file would be a restaurant's wine inventory. “As sales are made and new shipments are received, the computer updates the file,” says O'Brien. “This information might come directly from a point-of-sale terminal or might be entered manually by an employee,” he further explains. On the intrinsic side of the equation, O'Brien wants you to know that the information itself is the valuable part of this type of file. Its value is over and above the file's informational purpose as a pragmatic business tool, as it is in inventory control. “The information is money in the legal sense. For instance, figures moved about in banking system computers do not represent dollars; they are dollars,” O'Brien explains. “If the record of a dollar amount is erased from all computer files, then that money ceases to exist,” he warns. 
This type of information can also be bought and sold, as it is with customer lists sold to advertisers. Files must be protected, O'Brien stresses. “File security requires a systematic approach,” he discloses. O'Brien goes on to explain important elements to consider when evaluating file information. File back-up is also an important factor to think about, along with file storage and safety concerns. “Sooner or later, every property will have its fire, flood, careless mistake, or disgruntled employee,” O'Brien closes. “…good file control can minimize or prevent damage to the business as a whole.”


The Duke Free-electron laser (FEL) system, driven by the Duke electron storage ring, has been at the forefront of developing new light source capabilities over the past two decades. In 1999, the Duke FEL demonstrated the first lasing of a storage ring FEL in the vacuum ultraviolet (VUV) region at $194$ nm using two planar OK-4 undulators. With two helical undulators added to the outboard sides of the planar undulators, in 2005 the highest FEL gain ($47.8\%$) of a storage ring FEL was achieved using the Duke FEL system with a four-undulator configuration. In addition, the Duke FEL has been used as the photon source to drive the High Intensity $\gamma$-ray Source (HIGS) via Compton scattering of the FEL beam and the electron beam inside the FEL cavity. Taking advantage of the FEL's wavelength tunability as well as the adjustability of the energy of the electron beam in the storage ring, a nearly monochromatic $\gamma$-ray beam has been produced in a wide energy range from $1$ to $100$ MeV at the HIGS. To further push the FEL short-wavelength limit and enhance the FEL gain in the VUV regime for high-energy $\gamma$-ray production, two additional helical undulators were installed in 2012 using an undulator switchyard system to allow switching between the two planar and two helical undulators in the middle section of the FEL system. Using different undulator configurations made possible by the switchyard, a number of novel capabilities of the storage ring FEL have been developed and exploited over a wide FEL wavelength range from infrared (IR) to VUV. These new capabilities will eventually be made available to the $\gamma$-ray operation, which will greatly enhance the $\gamma$-ray user research program, creating new opportunities for certain types of nuclear physics research.
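The $\gamma$-ray energy range quoted above follows from the kinematics of Compton back-scattering, set by the electron beam energy and the FEL wavelength. A minimal sketch of the standard head-on Compton-edge formula; the beam energy and wavelength in the example are illustrative assumptions, not HIGS operating points:

```python
# Approximate maximum gamma-ray energy from head-on Compton
# scattering of an FEL photon off a relativistic electron.
# A sketch of the textbook kinematics, not the HIGS design code.

M_E_C2 = 0.511e6      # electron rest energy in eV

def compton_edge_eV(e_beam_GeV, fel_wavelength_nm):
    """Maximum scattered photon energy (eV) for a head-on collision."""
    gamma = e_beam_GeV * 1e9 / M_E_C2        # Lorentz factor
    e_laser = 1239.84 / fel_wavelength_nm    # photon energy in eV
    x = 4 * gamma * e_laser / M_E_C2         # electron recoil parameter
    return 4 * gamma**2 * e_laser / (1 + x)

# e.g. a hypothetical 500 MeV electron beam colliding with 450 nm
# FEL photons yields a gamma-ray beam of roughly 10 MeV
print(compton_edge_eV(0.5, 450) / 1e6)
```

Raising the electron energy or shortening the FEL wavelength pushes the edge upward, which is why extending the FEL into the VUV matters for high-energy $\gamma$-ray production.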

With the wide wavelength tuning range, the FEL is an intrinsically well-suited device to produce lasing with multiple colors. Taking advantage of the availability of an undulator system with multiple undulators, we have demonstrated the first two-color lasing of a storage ring FEL. Using either a three- or four-undulator configuration with a pair of dual-band high reflectivity mirrors, we have achieved simultaneous lasing in the IR and UV spectral regions. With the low-gain feature of the storage ring FEL, the power generated at the two wavelengths can be equally built up and precisely balanced to reach FEL saturation. A systematic experimental program to characterize this two-color FEL has been carried out, including precise power control, a study of the power stability of two-color lasing, wavelength tuning, and the impact of the FEL mirror degradation. Using this two-color laser, we have started to develop a new two-color $\gamma$-ray beam for scientific research at the HIGS.

Using the undulator switchyard, the four helical undulators installed in the beamline can be configured to not only enhance the FEL gain in the VUV regime, but also allow for full polarization control of the FEL beams. For the accelerator operation, the use of helical undulators is essential to extend the FEL mirror lifetime by reducing radiation damage from harmonic undulator radiation. Using a pair of helical undulators with opposite helicities, we have realized (1) fast helicity switching between left- and right-circular polarizations, and (2) the generation of fully controllable linear polarization. In order to extend these new capabilities of polarization control to the $\gamma$-ray operation in a wide energy range at the HIGS, a set of FEL polarization diagnostic systems needs to be developed to cover the entire FEL wavelength range. The preliminary development of the polarization diagnostics for the wavelength range from IR to UV has been carried out.


Scientists planning to use underwater stereoscopic image technologies are often faced with numerous problems during the methodological implementation: commercial equipment is too expensive; the setup or calibration is too complex; or the image processing (i.e. measuring objects in the stereo-images) is too complicated to be performed without a time-consuming phase of training and evaluation. The present paper addresses some of these problems and describes a workflow for stereoscopic measurements for marine biologists. It also provides instructions on how to assemble an underwater stereo-photographic system with two digital consumer cameras and gives step-by-step guidelines for setting up the hardware. The second part details a software procedure to correct stereo-image pairs for lens distortions, which is especially important when using cameras with non-calibrated optical units. The final part presents a guide to the process of measuring the lengths (or distances) of objects in stereoscopic image pairs. To reveal the applicability and the restrictions of the described systems and to test the effects of different types of camera (a compact camera and an SLR type), experiments were performed to determine the precision and accuracy of two generic stereo-imaging units: a diver-operated system based on two Olympus Mju 1030SW compact cameras and a cable-connected observatory system based on two Canon 1100D SLR cameras. In the simplest setup, without any correction for lens distortion, the low-budget Olympus Mju 1030SW system achieved mean accuracy errors (percentage deviation of a measurement from the object's real size) between 10.2% and -7.6% (overall mean value: -0.6%), depending on the size, orientation and distance of the measured object from the camera. With the single-lens reflex (SLR) system, very similar values between 10.1% and -3.4% (overall mean value: -1.2%) were observed. 
Correction of the lens distortion significantly improved the mean accuracy errors of either system. Moreover, system precision (the spread of the accuracy) improved significantly in both systems. Neither the use of a wide-angle converter nor multiple reassembly of the system had a significant negative effect on the results. The study shows that underwater stereophotography, independent of the system, has a high potential for robust and non-destructive in situ sampling and can be used without prior specialist training.
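The accuracy metric used above is straightforward to state in code. A minimal sketch of the percentage-deviation definition; the measurement values are made up for illustration and are not the study's data:

```python
# Accuracy error as defined in the study: the percentage deviation
# of a stereoscopic length measurement from the object's real size.
# Negative values mean the system under-measured the object.

def accuracy_error(measured_mm, real_mm):
    return 100.0 * (measured_mm - real_mm) / real_mm

measurements = [101.5, 98.2, 100.4]   # hypothetical lengths (mm)
real_size = 100.0

errors = [accuracy_error(m, real_size) for m in measurements]
mean_error = sum(errors) / len(errors)   # systematic bias of the unit
spread = max(errors) - min(errors)       # rough proxy for precision
print(mean_error, spread)
```

The "overall mean value" reported for each camera system corresponds to `mean_error` over all test objects, while "precision" refers to the spread of the individual errors.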


The Dickson Land peninsula is located in central West-Spitsbergen between the NNE branches of Isfjorden. The climatic firn line lying at 500 m gives rise to plateau glaciers with outlet tongues, which are characteristic of S-Dickson Land. The distribution of valley glaciers and the variations of the orographic firn line depend on wind direction. In comparing the firn lines established by the methods of LICHTENECKER (1938) and VISSER (1938) with the values calculated by the method of v. HÖFER (1879), differences of up to 107 m are found. These differences may depend on the inclination and distance relationships of the glaciers above and below the real firn lines. During the latest glacial advance, Dickson Land was located on the peripheries of two local glaciation centers. At that time an inland glaciation of West-Spitsbergen did not exist. The formation of a subglacial channel system dates back to the maximum extent of the late glacial phase before 17,500 B.P. (+2000/-1375 years). A correlation of postglacial stadia and 14C-dated marine terraces (FEYLING-HANSSEN & OLSSON, 1960; FEYLING-HANSSEN, 1965) is possible. Considering isostatic movement and the difference between calculated and real firn lines, a postglacial stadium at about 10,400 B.P. can be reconstructed with a firn line lying 265 m above former sea level. On average, the absolute depression below the recent firn line amounted to 246 m. A stagnation at 9,650 B.P. coincided with a firn line at 315 m above former sea level and a depression of 173 m. Around 1890 A.D., glacial fluctuations corresponded to a firn line at 415 m (depression: 64 m). To some extent the morphology of the main valleys appears to depend on structure and petrography; therefore their value as indicators of former glaciations is questionable. The periglacial forms are shown on a large-scale map. At the time of the "Holocene warm interval", between 7000 and 2000 B.P. (FEYLING-HANSSEN, 1955a, 1965), an increase of periglacial activity seems likely. 
This can be explained by a simultaneous increase in the depth of the active layer in both soil and bedrock.


This paper presents a model for the availability analysis of a standalone hybrid microgrid. The microgrid used in the study consists of wind, solar, storage and a diesel generator. A Boolean-driven Markov process is used to develop the availability model of the system in the proposed method. By modifying the developed model, the relationship between the availability of the system and the durations of fine (normal) and disturbed (stormy) weather is analyzed. The effects of different converter technologies on the availability of the standalone microgrid were investigated, and the results show that the availability of the microgrid increases by 5.80% when a storage system is added. On the other hand, the availability of a standalone microgrid can be overestimated by 3.56% when the weather factor is neglected. Likewise, 200, 500 and 1000 hours of disturbed weather reduce the availability of the system by 5.36%, 9.73% and 13.05%, respectively. In addition, the hybrid energy storage cascade topology with a capacitor in the middle maximized the system availability.
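The Markov availability idea above can be illustrated with its simplest building block: a repairable component that alternates between an "up" and a "down" state. A minimal sketch under an independence assumption; the failure and repair rates below are illustrative, not the paper's fitted values:

```python
# Steady-state availability of a two-state (up/down) Markov process:
# a component fails at rate lam and is repaired at rate mu, so the
# long-run probability of being up is mu / (lam + mu).

def availability(lam, mu):
    """Steady-state probability of the 'up' state."""
    return mu / (lam + mu)

def series_availability(components):
    """A series system is up only if every component is up
    (assumes independent components)."""
    a = 1.0
    for lam, mu in components:
        a *= availability(lam, mu)
    return a

# e.g. a converter (one failure per 8760 h, 24 h mean repair) in
# series with a storage unit (one failure per 4380 h, 48 h repair)
print(series_availability([(1 / 8760, 1 / 24), (1 / 4380, 1 / 48)]))
```

The paper's Boolean-driven Markov model generalises this by encoding which component combinations leave the microgrid operational, and by switching rate sets between normal and stormy weather periods.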


Safety on public transport is a major concern for the relevant authorities. We address this issue by proposing an automated surveillance platform which combines data from video, infrared and pressure sensors. Data homogenisation and integration is achieved by a distributed architecture based on communication middleware that resolves interconnection issues, thereby enabling data modelling. A common-sense knowledge base models and encodes knowledge about public-transport platforms and the actions and activities of passengers. Trajectory data from passengers is modelled as a time-series of human activities. Common-sense knowledge and rules are then applied to detect inconsistencies or errors in the data interpretation. Lastly, the rationality that characterises human behaviour is also captured here through a bottom-up Hierarchical Task Network planner that, along with common sense, corrects misinterpretations to explain passenger behaviour. The system is validated using a simulated bus saloon scenario as a case study. Eighteen video sequences were recorded with up to six passengers. Four metrics were used to evaluate performance. The system, with an accuracy greater than 90% for each of the four metrics, was found to outperform both a rule-based system and a system using planning alone.
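The common-sense check described above can be sketched as a rule over the activity time-series: some activity transitions simply cannot happen, so an interpretation containing one is flagged for correction. The activity labels and rule set below are hypothetical examples, not the platform's actual knowledge base:

```python
# Toy common-sense consistency check over a passenger trajectory
# modelled as a time-series of activity labels. The allowed-transition
# table is an illustrative stand-in for the knowledge base.

ALLOWED_NEXT = {
    "enter_bus": {"walk_aisle", "pay_fare"},
    "pay_fare": {"walk_aisle"},
    "walk_aisle": {"sit_down", "stand_hold_rail", "exit_bus"},
    "sit_down": {"stand_up"},
    "stand_up": {"walk_aisle"},
    "stand_hold_rail": {"walk_aisle", "exit_bus"},
}

def inconsistent_steps(trajectory):
    """Return indices where a transition breaks a common-sense rule."""
    bad = []
    for i in range(len(trajectory) - 1):
        cur, nxt = trajectory[i], trajectory[i + 1]
        if nxt not in ALLOWED_NEXT.get(cur, set()):
            bad.append(i + 1)
    return bad

# "sit_down" followed directly by "exit_bus" skips standing up,
# so index 4 is flagged as a likely sensor misinterpretation
print(inconsistent_steps(
    ["enter_bus", "pay_fare", "walk_aisle", "sit_down", "exit_bus"]))
```

In the full system a flagged step is not simply rejected: the HTN planner searches for a plausible activity sequence that explains the sensor data while satisfying the rules.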


Thesis (Ph.D.)--University of Washington, 2016-08


The value of integrating heat storage into a geothermal district heating system has been investigated. The behaviour of the system under a novel operational strategy has been simulated, focusing on the energetic, economic and environmental effects of incorporating the heat storage within the system. A typical geothermal district heating system consists of several production wells, a system of pipelines for the transportation of the hot water to end-users, one or more re-injection wells and peak-up devices (usually fossil-fuel boilers). Traditionally in these systems, the production wells change their production rate throughout the day according to heat demand, and if their maximum capacity is exceeded the peak-up devices are used to meet the balance of the heat demand. In this study, it is proposed to maintain a constant geothermal production and add heat storage to the network. Hot water will then be stored when heat demand is lower than the production, and the stored hot water will be released into the system to cover the peak demands (or part of them). The intention is not to phase out the peak-up devices entirely, but to decrease their use, as these will often be installed anyway for back-up purposes. Both the integration of heat storage into such a system and the novel operational strategy are the main novelties of this thesis. A robust algorithm for the sizing of these systems has been developed. The main inputs are the geothermal production data, the heat demand data over one year or more, and the topology of the installation. The outputs are the sizing of the whole system, including the necessary number of production wells, the size of the heat storage and the dimensions of the pipelines, amongst others. The results provide several useful insights into the initial design considerations for these systems, particularly emphasizing the importance of heat losses. 
Simulations are carried out for three different sizings of the installation (small, medium and large) to examine the influence of system scale. In the second phase of work, two algorithms are developed which study in detail the operation of the installation throughout an arbitrary day and a whole year, respectively. The first algorithm can be a powerful tool for the operators of the installation, who can know a priori how to operate the installation on a given day from its heat demand. The second algorithm is used to obtain the amount of electricity used by the pumps as well as the amount of fuel used by the peak-up boilers over a whole year. These comprise the main operational costs of the installation and are among the main inputs of the third part of the study. In the third part of the study, an integrated energetic, economic and environmental analysis of the studied installation is carried out, together with a comparison with the traditional case. The results show that by implementing heat storage under the novel operational strategy, heat is generated more cheaply, as all the financial indices improve, more geothermal energy is utilised and less fuel is used in the peak-up boilers, with subsequent environmental benefits, when compared to the traditional case. Furthermore, it is shown that the most attractive sizing is the large one, although the addition of the heat storage has the greatest impact in the medium sizing. In other words, the geothermal component of the installation should be sized as large as possible. This analysis indicates that the proposed solution is beneficial from energetic, economic and environmental perspectives. Therefore, it can be stated that the aim of this study is fully achieved. 
Furthermore, the new models for the sizing, operation and economic/energetic/environmental analyses of these kinds of systems can be used with few adaptations for real cases, making the practical applicability of this study evident. With this study as a starting point, further work could include the integration of these systems with end-user demands, further analysis of component parts of the installation (such as the heat exchangers) and the integration of a heat pump to maximise the utilisation of geothermal energy.
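The operational strategy described above — constant geothermal production, a store that charges on surplus and discharges at peaks, and the boiler covering any remaining deficit — can be sketched as a simple hourly dispatch loop. The capacities and demand profile below are illustrative assumptions, not the thesis's sizing results:

```python
# Hourly dispatch sketch for the novel strategy: geothermal output is
# held constant; surplus heat charges the store, deficits are served
# first from the store and then by the peak-up boiler.

def dispatch(demand, production, capacity):
    """Return the boiler output (same units as demand) per hour."""
    stored = 0.0
    boiler = []
    for d in demand:
        if production >= d:
            # surplus hour: charge the store up to its capacity
            stored = min(capacity, stored + (production - d))
            boiler.append(0.0)
        else:
            # deficit hour: discharge the store, boiler covers the rest
            deficit = d - production
            used = min(stored, deficit)
            stored -= used
            boiler.append(deficit - used)
    return boiler

# constant 10 MW geothermal output against a peaky demand profile (MW)
demand = [6, 6, 8, 14, 12, 9]
print(dispatch(demand, production=10, capacity=5))
```

Note how the 14 MW peak is absorbed entirely by the store, and the boiler only fires in the following hour once the store is nearly empty — the mechanism by which the strategy reduces, without eliminating, peak-up fuel use.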


Aim: To evaluate the dislocation resistance of the quartz fiber post/cement/dentin interface after different adhesion strategies. Methods: Forty bovine lower central incisors were selected and prepared with K-files using the step-back technique, and irrigated with 3 mL of distilled water before the use of each instrument. Prepared teeth were stored at 37 °C and 100% humidity for 7 days. The roots were prepared and randomized into 4 groups, and the quartz fiber post was cemented with an adhesion strategy according to the following groups: GBisCem - BisCem; GOneStep+C&B - One Step + C&B; GAllBond+C&B - AllBond 3 + C&B; GAllBondSE+C&B - AllBond SE + C&B. Cross-sectional root slices of 0.7 mm were produced and stored for 24 h at 37 °C before being subjected to push-out bond strength testing. Results: The mean and standard deviation values of dislocation resistance were GBisCem: 1.12 (± 0.23) MPa, GOneStep+C&B: 0.81 (± 0.31) MPa, GAllBond+C&B: 0.98 (± 0.14) MPa, and GAllBondSE+C&B: 1.57 (± 0.04) MPa. GAllBondSE+C&B showed significantly higher dislocation resistance than the other groups. Conclusions: Within the limits of this study design, it may be concluded that the adhesion strategies produced different quartz post dislocation resistance, and the simplified adhesive system with sodium benzene sulphinate incorporation provided superior dislocation resistance.
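The MPa values above come from normalising the debonding force by the bonded area of each root slice. A minimal sketch of the usual push-out calculation, assuming a cylindrical post segment; the force and post diameter in the example are made up, and only the 0.7 mm slice thickness comes from the study:

```python
# Push-out bond strength: debonding force divided by the lateral
# (bonded) surface of the post segment in the slice. For a
# cylindrical post that surface is pi * diameter * slice thickness.

import math

def push_out_strength_MPa(force_N, post_diameter_mm, slice_mm=0.7):
    bonded_area_mm2 = math.pi * post_diameter_mm * slice_mm
    return force_N / bonded_area_mm2     # N/mm^2 is numerically MPa

# e.g. a hypothetical 3.5 N debonding force on a 1.5 mm diameter post
print(push_out_strength_MPa(3.5, 1.5))
```

For tapered posts the bonded area is computed from the slice's two diameters instead; the cylindrical form here is the simplest case.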


This thesis presents a study of the Grid data access patterns in distributed analysis in the CMS experiment at the LHC accelerator. This study ranges from a deep analysis of the historical patterns of access to the most relevant data types in CMS, to the exploitation of a supervised Machine Learning classification system to set up machinery able to predict future data access patterns - i.e. the so-called dataset “popularity” of the CMS datasets on the Grid - with a focus on specific data types. All the CMS workflows run on the Worldwide LHC Computing Grid (WLCG) computing centers (Tiers), and in particular the distributed analysis system sustains hundreds of users and the applications they submit every day. These applications (or “jobs”) access different data types hosted on disk storage systems at a large set of WLCG Tiers. The detailed study of how this data is accessed, in terms of data types, hosting Tiers, and different time periods, makes it possible to gain precious insight into storage occupancy over time and different access patterns, and ultimately to extract suggested actions based on this information (e.g. targeted disk clean-up and/or data replication). In this sense, the application of Machine Learning techniques makes it possible to learn from past data and to gain predictive potential for future CMS data access patterns. Chapter 1 provides an introduction to High Energy Physics at the LHC. Chapter 2 describes the CMS Computing Model, with special focus on the data management sector, also discussing the concept of dataset popularity. Chapter 3 describes the study of CMS data access patterns at different levels of depth. Chapter 4 offers a brief introduction to basic machine learning concepts, gives an introduction to its application in CMS, and discusses the results obtained by using this approach in the context of this thesis.
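The supervised-classification idea can be illustrated with a deliberately tiny stand-in: predict whether a dataset will be "popular" (heavily accessed) in the next period from features of its past access history. The features, labels and nearest-neighbour rule below are hypothetical illustrations, not the thesis's actual pipeline or data:

```python
# Toy popularity classifier: 1-nearest-neighbour over per-dataset
# access features. A plain-Python stand-in for the supervised ML
# system described in the thesis; all numbers are invented.

def predict_popular(features, train):
    """train: list of (feature_vector, label); returns the label of
    the nearest training example by squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda fl: dist2(features, fl[0]))[1]

# per-dataset features: (accesses last week, distinct users)
train = [
    ((120, 35), "popular"),
    ((2, 1), "unpopular"),
    ((60, 20), "popular"),
    ((0, 1), "unpopular"),
]
print(predict_popular((90, 28), train))
```

A prediction of "popular" would argue for replicating the dataset to more Tiers, while sustained "unpopular" predictions would mark replicas as candidates for targeted disk clean-up.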


This chapter aims to develop a new method for the economic evaluation of hybrid systems for electricity production. The contribution of the different types of renewable sources to the economic performance of the overall equipment is evaluated specifically. The presented methodology was applied to evaluate the design of a photovoltaic-wind-diesel hybrid system to produce electricity for a community in the neighbourhood of Luanda, Angola. Once the hybrid generator is selected, it is proposed to provide the system with a supervisory control strategy to maximize its operating efficiency.
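One common metric in this kind of economic evaluation is the levelised cost of energy (LCOE): discounted lifetime cost divided by discounted lifetime energy. A minimal sketch of the standard formula; the cash flows and plant figures below are illustrative assumptions, not the chapter's methodology or data:

```python
# Levelised cost of energy: sum of discounted costs over the project
# lifetime divided by the sum of discounted energy production.

def lcoe(capex, annual_opex, annual_kwh, years, rate):
    cost = capex                     # investment paid up front
    energy = 0.0
    for t in range(1, years + 1):
        f = (1 + rate) ** -t         # discount factor for year t
        cost += annual_opex * f      # O&M and fuel each year
        energy += annual_kwh * f
    return cost / energy             # currency units per kWh

# e.g. a hypothetical PV-wind-diesel plant: 100 k$ capex, 8 k$/yr
# O&M and fuel, 60 MWh/yr over 20 years at an 8% discount rate
print(lcoe(100_000, 8_000, 60_000, 20, 0.08))
```

Comparing such a figure per source (PV, wind, diesel) is one way to expose how each technology contributes to the economics of the overall hybrid system.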


In the semiarid region of Brazil, the use of irrigation systems for applying fertilizers in horticulture is the primary means of incorporating nutrients into the soil. However, the use of this technique in wine grapevines still needs to be assessed. In view of this, this study aimed to assess nitrate and potassium concentrations in soil fertigated with nitrogen and potassium fertilizers over 3 wine grape growing cycles. A field experiment was conducted with 'Syrah' wine grapes in Petrolina, Pernambuco, Brazil; it assessed five nitrogen doses (0, 15, 30, 60 and 120 kg ha-1) and five K2O doses (0, 15, 30, 60 and 120 kg ha-1) applied by a drip irrigation system with two emitters per plant and a flow rate of 4 L h-1. The experimental design was a factorial split-plot, making up 13 combinations arranged in 4 randomized blocks. Soil solution samples were collected weekly with the aid of porous cup extractors for all treatments and at depths of 0.4 and 0.6 m, to determine nitrate and potassium concentrations and electrical conductivity. Increased levels of both nutrients in the irrigation water increased the availability of nitrate and potassium in the soil solution. The highest nitrate and potassium concentrations were found in the second growing cycle at both depths studied.


Advanced Driver Assistance Systems (ADAS) are proving to have huge potential for road safety, comfort, and efficiency. In recent years, car manufacturers have equipped their high-end vehicles with Level 2 ADAS, which are, according to SAE International, systems that combine both longitudinal and lateral active motion control. These automated driving features, while only available in highway scenarios, appear to be very promising towards the introduction of hands-free driving. However, as they rely only on an on-board sensor suite, their continuous operation may be affected by the current environmental conditions: this prevents certain functionalities, such as the automated lane change, and requires the driver to keep their hands constantly on the steering wheel. The enabling factor for hands-free highway driving proposed by Mobileye is the integration of high-definition maps, thus leading to the so-called Level 2+. This thesis was carried out during an internship in Maserati's Virtual Engineering team. The activity consisted of the design of an L2+ Highway Assist System following the Rapid Control Prototyping approach, starting from the definition of the requirements up to the real-time implementation and testing on a simulator of the brand-new compact SUV Maserati Grecale. The objective was to enhance the current Level 2 highway driving assistance system with hands-free driving capability; for this purpose an Autonomous Lane Change functionality has been designed, proposing a Model Predictive Control-based decision-maker in charge of assessing both the feasibility and the convenience of performing a lane-change maneuver. The result is a Highway Assist System capable of driving the vehicle in a traffic scenario safely and efficiently, never requiring driver intervention.
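The decision-maker's two questions — is the lane change feasible, and is it convenient? — can be caricatured with a simple rule-based check. This is a toy heuristic stand-in, not the thesis's Model Predictive Control formulation; every threshold and speed below is an invented example:

```python
# Toy lane-change decision: the manoeuvre is proposed only if the
# target-lane gaps are safe (feasibility) and a meaningful speed
# gain is available while stuck behind a slower leader (convenience).
# An MPC decision-maker would instead optimise predicted trajectories
# over a horizon; this sketch only mirrors the two criteria.

def lane_change_ok(ego_speed, lead_speed, rear_gap_m, front_gap_m,
                   target_lane_speed, min_gap_m=30.0,
                   min_speed_gain=2.0):
    feasible = rear_gap_m >= min_gap_m and front_gap_m >= min_gap_m
    convenient = (target_lane_speed - lead_speed) >= min_speed_gain
    stuck = lead_speed < ego_speed      # leader slows the ego vehicle
    return feasible and convenient and stuck

# ego cruising at 36 m/s behind a 28 m/s truck; the target lane
# flows at 35 m/s with ample gaps, so a change is proposed
print(lane_change_ok(36.0, 28.0, rear_gap_m=45.0, front_gap_m=60.0,
                     target_lane_speed=35.0))
```

The value of the MPC formulation over such fixed thresholds is that feasibility and convenience are evaluated on predicted future states, so the decision adapts to closing speeds rather than instantaneous gaps.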


In recent years, global supply chains have increasingly suffered from reliability issues due to various external and difficult-to-manage events. This paper aims to build an integrated approach for the design of a supply chain under the risk of disruption and demand fluctuation. The study is divided into two parts: a mathematical optimization model, to identify the optimal design and customer-facility assignments, and a discrete-event simulation of the resulting network. The first part describes a model in which plant location decisions are influenced by variables such as distance to customers, the investment needed to open plants, and centralization phenomena that help contain the risk of demand variability (risk pooling). The entire model has been built with a proactive approach to manage the risk of disruptions, assigning to each customer two types of open facilities: one that will serve it under normal conditions and a back-up facility, which comes into operation when the main facility has failed. The study is conducted on a relatively small number of instances due to the computational complexity; a matheuristic approach can be found in part A of the paper to evaluate the problem with a larger set of players. Once the network is built, a discrete-event Supply Chain Simulation (SCS) is implemented to analyze the stock flow within the facilities' warehouses, the actual impact of disruptions, and the role of the back-up facilities, which suffer great stress on their inventory due to the large increase in demand caused by the disruptions. The simulation therefore follows a reactive approach, in which customers are redistributed among facilities according to the interruptions that may occur in the system and to the assignments deriving from the design model. Lastly, the most important results of the study are reported, analyzing the role of lead time in a reactive approach to the occurrence of disruptions and comparing the two models in terms of costs.
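The proactive dual assignment described above can be sketched in a few lines: each customer is paired with a primary facility and a backup that takes over if the primary is disrupted. The coordinates and the open-facility set below are made-up inputs for illustration, not the paper's optimisation output (which also weighs opening costs and risk pooling):

```python
# Sketch of the primary/backup assignment: for each customer, rank
# the open facilities by distance and keep the two nearest.

def assign(customers, facilities):
    """Return {customer: (primary, backup)} by squared distance."""
    out = {}
    for name, (cx, cy) in customers.items():
        ranked = sorted(
            facilities,
            key=lambda f: (facilities[f][0] - cx) ** 2
                        + (facilities[f][1] - cy) ** 2)
        out[name] = (ranked[0], ranked[1])
    return out

facilities = {"F1": (0, 0), "F2": (10, 0), "F3": (5, 8)}
customers = {"C1": (1, 1), "C2": (9, 1), "C3": (5, 6)}
print(assign(customers, facilities))
```

In the reactive simulation phase, a disruption at a customer's primary facility reroutes its demand to the stored backup, which is what concentrates inventory stress on the backup facilities.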


Despite the remarkable improvements in breast cancer (BC) characterization, accurate prediction of BC clinical behavior is often still difficult to achieve. Some studies have investigated the association between the molecular subtype, namely basal-like BC, and the pattern of relapse; however, only a few have investigated the association between relapse pattern and immunohistochemically defined triple-negative breast cancers (TNBCs). The aim of this study was to evaluate the pattern of relapse in patients with TNBC, namely the primary distant relapse site. One hundred twenty-nine (129) invasive breast carcinomas with follow-up information were classified according to molecular subtype using immunohistochemistry for ER, PgR and Her2. The association between TNBC and the primary site of distant relapse was analyzed by logistic regression. In multivariate logistic regression analysis, patients with TNBC displayed only 0.09 times (95% CI: 0.00-0.74; p=0.02) the odds of non-TNBC patients of developing a primary relapse in bone. Regarding visceral and lymph-node relapse, no differences were found in this cohort. Though classically regarded as aggressive tumors, TNBCs rarely develop a primary relapse in bone when compared to non-TNBCs, a clinically relevant fact when investigating a metastasis of an occult or non-sampled primary BC.
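An odds ratio like the 0.09 reported above summarises a 2x2 table of relapse site by subtype. A minimal sketch of the arithmetic with a Wald confidence interval; the counts below are entirely hypothetical, chosen only to illustrate the calculation, and are not the study's data (which also adjusts for covariates in the multivariate model):

```python
# Odds ratio and 95% Wald CI from a 2x2 table
# [[a, b], [c, d]] = [[TNBC bone relapse, TNBC other relapse],
#                     [non-TNBC bone relapse, non-TNBC other relapse]]

import math

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

def wald_ci_95(a, b, c, d):
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (math.exp(log_or - 1.96 * se),
            math.exp(log_or + 1.96 * se))

# hypothetical counts: 1 of 21 TNBC vs 25 of 75 non-TNBC patients
# with a primary bone relapse gives an odds ratio well below 1
print(odds_ratio(1, 20, 25, 50), wald_ci_95(1, 20, 25, 50))
```

An odds ratio below 1 with an upper confidence limit below 1, as in the study, indicates that bone is significantly less likely to be the primary relapse site in TNBC patients.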