985 results for Established system


Relevance: 30.00%

Abstract:

Small errors can prove catastrophic. A very small cause that escapes our notice can determine a considerable effect that we cannot fail to see, and we then say that the effect is due to chance. Small differences in the initial conditions produce very great ones in the final phenomena; a small error in the former produces an enormous error in the latter.

When dealing with any kind of electrical device specification, it is important to note that a pair of test conditions defines each test: the forcing function and the limit. Forcing functions define the external operating constraints placed upon the device under test; the test itself measures how well the device responds to those constraints. Forcing inputs to threshold, for example, represents the most demanding testing because it places those inputs as close as possible to the actual switching points and guarantees that the device will meet its input-output specifications.

Prediction becomes impossible by classical analytical methods bounded by Newton and Euclid. We have found that nonlinear dynamics is the natural state of all circuits and devices, and that opportunities exist for effective error detection in a nonlinear-dynamics and chaos environment. Nowadays a set of linear limits is established around every aspect of digital and analog circuits, and devices that fail these tests are considered bad. Deterministic chaos in circuits is a fact, not a possibility, as this Ph.D. research has shown. In standard linear informational methodologies this chaotic output is usually undesirable, and we are trained to seek a more regular stream of output data.

This Ph.D. research explored the possibility of taking the foundation of a well-known simulation and modeling methodology and introducing nonlinear dynamics and chaos precepts to produce a new error-detection instrument able to bring together streams of data scattered in space and time, thereby mastering deterministic chaos and changing the bad reputation of chaotic data as a potential risk for practical system-status determination.
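The sensitivity described above, where small initial differences produce enormous final errors, can be illustrated with the logistic map, a standard textbook chaotic system (not one of the circuits studied in this research):

```python
# Sensitive dependence on initial conditions in the chaotic logistic map:
# two trajectories starting 1e-9 apart diverge to order-one separation.
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) and return the final state."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)
print(abs(a - b))  # separation has grown enormously relative to the initial 1e-9
```

After roughly 30 iterations the two trajectories are effectively uncorrelated, which is exactly why long-term prediction by classical analytical means fails.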

Relevance: 30.00%

Abstract:

Security remains a top priority for organizations as their information systems continue to be plagued by security breaches. This dissertation developed a unique approach to assessing the security risks associated with information systems, based on a dynamic neural network architecture. The risks considered encompass the production computing environment and the client machine environment, and are expressed as metrics that define how susceptible each computing environment is to security breaches.

The merit of the approach lies in the design and implementation of artificial neural networks to assess the risks in the computing and client machine environments. The datasets used to implement and validate the model were obtained from business organizations through a web survey tool hosted by Microsoft. This site was designed to host anonymous surveys devised specifically for this dissertation; Microsoft customers could log in to the website and submit their responses to the questionnaire.

This work asserted that security in information systems does not depend exclusively on technology but rather on the triumvirate of people, process, and technology. The questionnaire, and consequently the neural network architecture, accounted for all three key factors that impact information systems security.

As part of the study, a methodology for developing, training, and validating such a predictive model was devised and successfully deployed. This methodology prescribed how to determine the optimal topology, activation function, and associated parameters for this security scenario. The assessment of the effects of security breaches on information systems has traditionally been post-mortem; this dissertation provides a predictive solution with which organizations can determine proactively how susceptible their environments are to security breaches.
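As an illustrative sketch only (the dissertation's actual topology, activation functions, and trained weights are not given here), a minimal feedforward network mapping the three survey factors to a susceptibility score might look like:

```python
import numpy as np

# Hypothetical one-hidden-layer network: survey scores for people, process,
# and technology (each in [0, 1]) -> a susceptibility metric in [0, 1].
# Weights below are random placeholders; in the study they would be learned
# from the anonymous survey datasets.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(3, 5))   # 3 inputs -> 5 hidden units
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 1))   # 5 hidden units -> 1 risk output
b2 = np.zeros(1)

def risk_score(x):
    """Forward pass producing a susceptibility score in (0, 1)."""
    h = np.tanh(x @ W1 + b1)
    return float(sigmoid(h @ W2 + b2))

score = risk_score(np.array([0.8, 0.4, 0.6]))
print(score)  # a value strictly between 0 and 1
```

The sigmoid output keeps the metric bounded, which makes environments directly comparable on a common risk scale.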

Relevance: 30.00%

Abstract:

Advanced engineering tools are in great demand in biology, biochemistry, and medicine, yet many existing instruments are expensive and require special facilities. With the advent of nanotechnology over the past decade, new approaches to developing devices and tools have been generated by academia and industry.

One such technology, NMR spectroscopy, has been used by biochemists for more than two decades to study the molecular structure of chemical compounds. However, NMR spectrometers are very expensive and require special laboratory rooms for proper operation, and high magnetic fields with strengths of several tesla make these instruments unaffordable to most research groups.

This doctoral research proposes a new technology for NMR spectrometers that can operate at field strengths below 0.5 tesla using an inexpensive permanent magnet and spin-dependent nanoscale magnetic devices. This portable NMR system is intended to analyze samples as small as a few nanoliters. The main problem to resolve when downscaling is obtaining an NMR signal with a high signal-to-noise ratio (SNR); a special tunneling magneto-resistive (TMR) sensor design was developed to achieve this goal. The minimum specifications for each component of the proposed NMR system were established, and a complete system was designed based on these minimum requirements. The goal was always to find cost-effective, realistic components. The novel design uses technologies such as Direct Digital Synthesis (DDS), Digital Signal Processing (DSP), and a special backpropagation neural network that finds the best match for the NMR spectrum. The system was designed, calculated, and simulated with excellent results. In addition, a general method to design TMR sensors was developed; the technique was automated, and a computer program was written to help the designer perform this task interactively.
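One concrete number behind the low-field design: the proton Larmor frequency f = (γ/2π)·B₀ fixes the frequency the DDS source must synthesize. At the proposed field of 0.5 T this is about 21.3 MHz:

```python
# Larmor frequency for 1H protons. gamma/2pi is approximately 42.577 MHz/T.
GAMMA_OVER_2PI_MHZ_PER_T = 42.577

def larmor_frequency_mhz(b0_tesla):
    """Resonance frequency in MHz for a given static field B0 in tesla."""
    return GAMMA_OVER_2PI_MHZ_PER_T * b0_tesla

f = larmor_frequency_mhz(0.5)
print(f)  # ~21.3 MHz at the proposed 0.5 T permanent magnet
```

Operating at tens of MHz rather than the hundreds of MHz of multi-tesla spectrometers is what makes inexpensive DDS and DSP electronics feasible for this design.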

Relevance: 30.00%

Abstract:

This dissertation establishes a novel system for human face learning and recognition based on incremental multilinear Principal Component Analysis (PCA). Most existing face recognition systems need training data during the learning process. The system proposed in this dissertation uses an unsupervised or weakly supervised learning approach in which the learning phase requires only a minimal amount of training data. It also overcomes the inability of traditional systems to adapt during the testing phase, where the decision process for newly acquired images continues to rely on the same old training data set. Consequently, when a new training set is to be used, the traditional approach requires that the entire eigensystem be regenerated. To speed up this computation, the proposed method uses the eigensystem generated from the old training set, together with the new images, to generate the new eigensystem more efficiently in a so-called incremental learning process. In the empirical evaluation phase, two key factors are essential in evaluating the performance of the proposed method: (1) recognition accuracy and (2) computational complexity. To establish the most suitable algorithm for this research, a comparative analysis of the best-performing methods was carried out first; its results advocated the initial use of multilinear PCA. To address the computational complexity of the subspace update procedure, a novel incremental algorithm was established that combines the traditional sequential Karhunen-Loeve (SKL) algorithm with a newly developed incremental modified fast PCA algorithm. To use multilinear PCA in the incremental process, a new unfolding method was developed that affixes the newly added data at the end of the previous data.
The results of the incremental process based on these two methods bear out these theoretical improvements. Object tracking results on video images are also provided as a further challenging task to demonstrate the soundness of this incremental multilinear learning method.
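A minimal sketch of an SKL-style subspace update (a simplified version of the sequential Karhunen-Loeve idea; mean updating and the multilinear unfolding described above are omitted): the old basis, scaled by its singular values, is combined with the newly acquired samples, so a small SVD replaces a full recomputation over all data.

```python
import numpy as np

def skl_update(U, S, new_data, k):
    """Simplified SKL update.

    U: d x k old eigenbasis, S: its k singular values,
    new_data: d x m matrix of newly acquired (centered) samples.
    Returns the top-k basis and singular values of the combined data.
    """
    augmented = np.hstack([U * S, new_data])          # d x (k + m), small
    U_new, S_new, _ = np.linalg.svd(augmented, full_matrices=False)
    return U_new[:, :k], S_new[:k]

rng = np.random.default_rng(1)
old = rng.normal(size=(20, 30))                        # 20-dim, 30 old samples
U, S, _ = np.linalg.svd(old, full_matrices=False)
U, S = U[:, :5], S[:5]                                 # keep 5 components

new = rng.normal(size=(20, 10))                        # 10 newly acquired samples
U2, S2 = skl_update(U, S, new, k=5)
print(U2.shape, S2.shape)                              # (20, 5) (5,)
```

The SVD here runs on a d × (k + m) matrix instead of d × (n + m), which is the source of the computational saving when the archive of past images is large.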

Relevance: 30.00%

Abstract:

Developing scientifically credible tools for measuring the success of ecological restoration projects is a difficult and non-trivial task. Yet reliable measures of the general health and ecological integrity of ecosystems are critical for assessing the success of restoration programs. The South Florida Ecosystem Restoration Task Force (Task Force), which helps coordinate a multi-billion-dollar, multi-organizational effort between federal, state, local, and tribal governments to restore the Florida Everglades, is using a small set of system-wide ecological indicators to assess the restoration efforts. A team of scientists and managers identified eleven ecological indicators from a field of several hundred through a selection process using 12 criteria to determine their applicability as part of a system-wide suite. The 12 criteria are: (1) Is the indicator relevant to the ecosystem? (2) Does it respond to variability at a scale that makes it applicable to the entire system? (3) Is the indicator feasible to implement and is it measurable? (4) Is the indicator sensitive to system drivers and is it predictable? (5) Is the indicator interpretable in a common language? (6) Are there situations where an optimistic trend with regard to an indicator might suggest a pessimistic restoration trend? (7) Are there situations where a pessimistic trend with regard to an indicator may be unrelated to restoration activities? (8) Is the indicator scientifically defensible? (9) Can clear, measurable targets be established for the indicator to allow for assessments of success? (10) Does the indicator have the specificity to result in corrective action? (11) What level of ecosystem process or structure does the indicator address? (12) Does the indicator provide early warning signs of ecological change?
In addition, a two-page stoplight report card was developed to assist in communicating the complex science inherent in ecological indicators in a common language for resource managers, policy makers, and the public. The report card employs the universally understood stoplight symbol, using green to indicate that targets are being met, yellow to indicate that targets have not been met and corrective action may be needed, and red to indicate that targets are far from being met and corrective action is required. This paper presents the scientific process and the results of the development and selection of the criteria, the indicators, and the stoplight report card format and content. The detailed process and results for the individual indicators are presented in companion papers in this special issue of Ecological Indicators.
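The stoplight mapping can be sketched as a simple classification; the numeric thresholds below are illustrative only, not the Task Force's actual assessment rules:

```python
def stoplight(fraction_of_target):
    """Map an indicator's status relative to its restoration target to a color.

    Green: target met; yellow: target not met, corrective action may be
    needed; red: far from target, corrective action required.
    Thresholds are hypothetical for illustration.
    """
    if fraction_of_target >= 1.0:
        return "green"
    if fraction_of_target >= 0.5:
        return "yellow"
    return "red"

print(stoplight(1.1), stoplight(0.7), stoplight(0.2))  # green yellow red
```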

Relevance: 30.00%

Abstract:

Based on a well-established stratigraphic framework and 47 AMS-14C dated sediment cores, the distribution of facies types on the NW Iberian margin is analysed in response to the last deglacial sea-level rise, thus providing a case study on the sedimentary evolution of a high-energy, low-accumulation shelf system. Altogether, four main types of sedimentary facies are defined. (1) A gravel-dominated facies occurs mostly as time-transgressive ravinement beds, which initially developed as shoreface and storm deposits in shallow waters on the outer shelf during the last sea-level lowstand; (2) a widespread, time-transgressive mixed siliceous/biogenic-carbonaceous sand facies indicates areas of moderate hydrodynamic regimes, high contribution of reworked shelf material, and fluvial supply to the shelf; (3) a glaucony-containing sand facies in a stationary position on the outer shelf formed mostly during the last-glacial sea-level rise by reworking of older deposits as well as authigenic mineral formation; and (4) a mud facies is mostly restricted to confined Holocene fine-grained depocentres, which are located in mid-shelf position. The observed spatial and temporal distribution of these facies types on the high-energy, low-accumulation NW Iberian shelf was essentially controlled by the local interplay of sediment supply, shelf morphology, and strength of the hydrodynamic system. These patterns are in contrast to high-accumulation systems, where extensive sediment supply is the dominant factor in the facies distribution. This study emphasises the importance of large-scale erosion and material recycling in the sedimentary buildup during the deglacial drowning of the shelf. The presence of a homogeneous and up to 15-m-thick transgressive cover above a lag horizon contradicts the common assumption of sparse and laterally confined sediment accumulation on high-energy shelf systems during deglacial sea-level rise.
In contrast to this extensive sand cover, laterally very confined and at most 4-m-thick mud depocentres developed during the Holocene sea-level highstand. This restricted formation of fine-grained depocentres was related to the combination of: (1) frequently occurring high-energy hydrodynamic conditions; (2) low overall terrigenous input by the adjacent rivers; and (3) the large distance of the Galicia Mud Belt from its main sediment supplier.

Relevance: 30.00%

Abstract:

This paper details a method of determining the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe-counting interferometer, with the multilateration technique employed to establish the three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device was tested which consisted of a network of Rotary-Laser Automatic Theodolites (R-LATs); this system is known commercially as indoor GPS (iGPS). The method was found to be practical and established that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. © Springer-Verlag Berlin Heidelberg 2010.
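The comparison step can be sketched as follows. The error vectors between measured and calibrated points are invented for illustration, the uncertainty estimate is a simple RMS-based one (the paper's full uncertainty treatment is not reproduced), and the coverage factor k = 2 corresponds to the ~95% confidence level quoted above:

```python
import numpy as np

# Calibrated reference coordinates (metres) and the iGPS-measured versions,
# offset by hypothetical sub-millimetre error vectors.
calibrated = np.array([[0.0, 0.0, 0.0],
                       [1.0, 0.0, 0.0],
                       [1.0, 1.0, 0.5]])
measured = calibrated + np.array([[ 0.0004, -0.0002,  0.0005],
                                  [-0.0003,  0.0006, -0.0001],
                                  [ 0.0002, -0.0005,  0.0004]])

deviations = np.linalg.norm(measured - calibrated, axis=1)  # 3-D point errors
U = 2.0 * np.sqrt(np.mean(deviations ** 2))                 # expanded, k = 2
print(deviations, U)  # each deviation well under 1 mm; U on the order of 1 mm
```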

Relevance: 30.00%

Abstract:

The Development Permit System was introduced with minimal directives for establishing a decision-making process. This contrasts with the long-established process for minor variances and suggests that the Development Permit System does not necessarily incorporate all of Ontario's fundamental planning principles. From this starting point, the study aimed to identify how minor variances are incorporated into the Development Permit System. The research was based around the following questions: • How are 'minor variance' applications processed within the DPS? • To what extent do the four tests of a minor variance influence the outcomes of lower-level applications in the DPS approval process? A case study approach was used for this research. The single-case design employed both qualitative and quantitative research methods, including a review of academic literature, court cases, and official documents, as well as a content analysis of Class 1, 1A, and 2 Development Permit application files from the Town of Carleton Place that were decided between 2011 and 2015. The content analysis found that minor variance issues were most commonly assigned to Class 1 applications. Planning staff generally met approval timelines and embraced their delegated approval authority, readily attaching conditions to applications in order to mitigate off-site impacts. While staff met the regulatory requirements of the DPS, 'minor variance' applications were largely decided on impact alone, demonstrating that the principles established by the four tests, the defining quality of the minor variance approval process, had not transferred to the Development Permit System. There was also some evidence that the development community has not fully adjusted to the requirements of the new approvals process, as some applications were supported using a rationale containing the four tests.
Subsequently, a set of four recommendations is offered reflecting the main themes established by the findings. The first two recommendations are directed towards the Province, the third to municipalities, and the fourth to developers and planning consultants: 1) Amend Ontario Regulation 608/06 so that provisions under Section 4(3)(e) fall under Section 4(2). 2) Change the rhetoric from "combining elements of minor variances" to "replacing minor variances". 3) Establish clear evaluation criteria. 4) Understand the evaluative criteria of the municipality in which you are working.

Relevance: 30.00%

Abstract:

Ageing and deterioration of infrastructure is a challenge facing transport authorities. In particular, there is a need for increased bridge monitoring in order to provide adequate maintenance, prioritise allocation of funds, and guarantee acceptable levels of transport safety. Existing bridge structural health monitoring (SHM) techniques typically involve direct instrumentation of the bridge with sensors and equipment for the measurement of properties such as frequencies of vibration. These techniques are important as they can indicate deterioration of the bridge condition. However, they can be labour-intensive and expensive due to the requirement for on-site installations. In recent years, alternative low-cost indirect vibration-based SHM approaches have been proposed which utilise the dynamic response of a vehicle to carry out "drive-by" pavement and/or bridge monitoring. The vehicle is fitted with sensors on its axles, thus reducing the need for on-site installations. This paper investigates the use of low-cost sensors incorporating global navigation satellite systems (GNSS) for implementation of the drive-by system in practice, via field trials with an instrumented vehicle. The potential of smartphone technology to be harnessed for drive-by monitoring is established, while smartphone GNSS tracking applications are found to compare favourably in terms of accuracy, cost, and ease of use with professional GNSS devices.
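The core signal-processing step in drive-by monitoring, recovering a vibration frequency from an axle-mounted sensor record, can be sketched with a synthetic signal (the 4 Hz "bridge" component, sampling rate, and noise level are invented for illustration):

```python
import numpy as np

# Synthetic axle acceleration: a 4 Hz bridge-vibration component plus noise.
fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)                 # 10 s record
bridge_freq = 4.0
noise = 0.3 * np.random.default_rng(0).normal(size=t.size)
signal = np.sin(2 * np.pi * bridge_freq * t) + noise

# Locate the dominant vibration frequency in the amplitude spectrum.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC bin
print(peak)  # recovers ~4 Hz
```

In the indirect approach, shifts in such an identified frequency over repeated passes are what hint at deterioration of the bridge condition.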

Relevance: 30.00%

Abstract:

The Outcomes Monitoring System (OMS) was established to systematically gather data on substance abuse treatment outcomes in Iowa. Randomly selected clients from 22 Iowa Department of Public Health-funded treatment agencies were contacted for follow-up interviews that occurred approximately six months after discharge from treatment. This report examines outcomes for clients admitted in calendar year 2013. Outcomes are presented for 334 of the clients who completed the follow-up interview.

Relevance: 30.00%

Abstract:

Polycyclic aromatic hydrocarbons (PAHs) pose a potential risk to human health and to marine biota in general, which makes their monitoring necessary. A miniaturized extraction system capable of extracting PAHs from seawater was developed and optimized, with the objective of implementing it in an oceanographic buoy in the future. A high-performance liquid chromatography method was optimized for the determination of the PAHs recovered by the extraction system. The analytical method was validated and applied to real samples from different points around Gran Canaria. The method is sensitive enough to detect and quantify concentrations below those established in the legislation. At some sampling sites certain compounds exceeded the legislative limits, while other compounds complied with them.

Relevance: 30.00%

Abstract:

The Property and Equipment Department has a central supply of automotive parts, tools, and maintenance supplies. This central supply serves the repair shop and also supplies parts to the various field garages and all departments of the Commission. The old procedure involved keeping track manually of all of the parts, some 22,000 items. All records, billings, and re-order points were kept manually, and many times a re-order point was discovered only by reaching into a bin and finding nothing there. To improve this situation, an inventory control system was established for use on the computer. A complete record of the supplies stored in the central warehouse was prepared, and this information was used to make a catalog. Each time an item is issued or received, it is processed through the inventory program; when the re-order point is reached, a notice to reorder is given. The procedure for taking inventory has been improved, and a voucher invoice is now prepared by the computer for all issues to departments. These are some of the many benefits that have been derived from this system.
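The re-order logic described above can be sketched in a few lines; the item name and thresholds are made up for illustration:

```python
# Minimal sketch of the computerised re-order check: each issue or receipt
# updates the on-hand quantity, and a reorder notice is raised when the
# quantity falls to or below the re-order point.
inventory = {"oil-filter": {"on_hand": 12, "reorder_point": 10}}

def process_transaction(item, quantity):
    """Negative quantity = issue, positive = receipt. Returns a notice or None."""
    record = inventory[item]
    record["on_hand"] += quantity
    if record["on_hand"] <= record["reorder_point"]:
        return f"REORDER {item}: {record['on_hand']} on hand"
    return None

print(process_transaction("oil-filter", -1))  # None: 11 on hand, above the point
print(process_transaction("oil-filter", -3))  # reorder notice: 8 on hand
```

Checking the threshold at every transaction is precisely what replaced "reaching into a bin and finding nothing there".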

Relevance: 30.00%

Abstract:

I explore transformative social innovation in agriculture through a particular case of agroecological innovation, the System of Rice Intensification (SRI) in India. Insights from social innovation theory that emphasize the roles of social movements and the reengagement of vulnerable populations in societal transformation can help reinstate the missing “social” dimension in current discourses on innovation in India. India has a rich and vibrant tradition of social innovation wherein vulnerable communities have engaged in collective experimentation. This is often missed in official or formal accounts. Social innovations such as SRI can help recreate these possibilities for change from outside the mainstream due to newer opportunities that networks present in the twenty-first century. I show how local and international networks led by Civil Society Organizations have reinterpreted and reconstructed game-changing macrotrends in agriculture. This has enabled the articulation and translation of an alternative paradigm for sustainable transitions within agriculture from outside formal research channels. These social innovations, however, encounter stiff opposition from established actors in agricultural research systems. Newer heterogeneous networks, as witnessed in SRI, provide opportunities for researchers within hierarchical research systems to explore, experiment, and create newer norms of engagement with Civil Society Organizations and farmers. I emphasize valuing and embedding diversity of practices and institutions at an early stage to enable systems to be more resilient and adaptable in sustainable transitions.

Relevance: 30.00%

Abstract:

The work presented in my thesis addresses the two cornerstones of modern astronomy: observation and instrumentation. Part I deals with the observation of two nearby active galaxies, the Seyfert 2 galaxy NGC 1433 and the Seyfert 1 galaxy NGC 1566, both at a distance of ~10 Mpc, which are part of the Nuclei of Galaxies (NUGA) sample. It is well established that every galaxy harbors a supermassive black hole (SMBH) at its center. Furthermore, there seems to be a fundamental correlation between the stellar bulge and SMBH masses. Simulations show that massive feedback, e.g., powerful outflows, in Quasi Stellar Objects (QSOs) has an impact on the mutual growth of bulge and SMBH. Nearby galaxies follow this relation but accrete mass at much lower rates. This gives rise to the following questions: Which mechanisms allow feeding of nearby Active Galactic Nuclei (AGN)? Is this feeding triggered by events, e.g., star formation, nuclear spirals, outflows, on ~500 pc scales around the AGN? Does feedback on these scales play a role in quenching the feeding process? Does it have an effect on the star formation close to the nucleus? To answer these questions I have carried out observations with the Spectrograph for INtegral Field Observation in the Near Infrared (SINFONI) at the Very Large Telescope (VLT) situated on Cerro Paranal in Chile. I have reduced and analyzed the recorded data, which contain spatial and spectral information in the H-band (1.45-1.85 µm) and K-band (1.95-2.45 µm) on the central 10″ × 10″ of the observed galaxies. Additionally, Atacama Large Millimeter/Sub-millimeter Array (ALMA) data at 350 GHz (~0.87 mm) as well as optical high-resolution Hubble Space Telescope (HST) images are used for the analysis. For NGC 1433 I deduce from comparison of the distributions of gas, dust, and intensity of highly ionized emission lines that the galaxy center lies ~70 pc north-northwest of the prior estimate.
A velocity gradient is observed at the new center, which I interpret as a bipolar outflow, a circumnuclear disk, or a combination of both. At least one dust and gas arm leads from an r ~ 200 pc ring towards the nucleus and might feed the SMBH. Two bright warm H2 gas spots are detected that indicate hidden star formation or a spiral arm-arm interaction. From the stellar velocity dispersion (SVD) I estimate an SMBH mass of ~1.74 × 10^7 solar masses. For NGC 1566 I observe a nuclear gas disk of ~150 pc in radius with a spiral structure. I estimate the total mass of this disk to be ~5.4 × 10^7 solar masses. What mechanisms excite the gas in the disk is not clear. Neither can the existence of outflows be proven, nor is star formation detected over the whole disk. On one side of the spiral structure I detect a star-forming region with an estimated star formation rate of ~2.6 × 10^-3 solar masses per year. From broad Brγ emission and the SVD I estimate a mean SMBH mass of ~5.3 × 10^6 solar masses with an Eddington ratio of ~2 × 10^-3. Part II deals with the final tests, which I conducted, of the Fringe and Flexure Tracker (FFTS) for the LBT INterferometric Camera and the NIR/Visible Adaptive iNterferometer for Astronomy (LINC-NIRVANA) at the Large Binocular Telescope (LBT) in Arizona, USA. The FFTS is the subsystem that combines the two separate beams of the LBT and enables near-infrared interferometry with a significantly larger field of view. The FFTS has a cryogenic system and an ambient-temperature system, which are separated by the baffle system. I redesigned this baffle to guarantee the functionality of the system after the final tests in the Cologne cryostat. The redesign did not affect any scientific performance of LINC-NIRVANA.
I show in the final cooldown tests that the baffle fulfills the temperature requirement and stays < 110 K, whereas the moving stages in the ambient system stay > 273 K, which was not the case for the old baffle design. Additionally, I tested the tilting flexure of the whole FFTS and show that accurate positioning of the detector and the tracking during observation can be guaranteed.
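As a quick check of the NGC 1566 numbers quoted above, the Eddington ratio can be converted into an implied bolometric luminosity using the standard relation L_Edd ≈ 1.26 × 10^38 (M/M_sun) erg/s:

```python
# Eddington luminosity for a black hole of given mass (in solar masses).
L_EDD_COEFF = 1.26e38          # erg/s per solar mass

def eddington_luminosity(m_bh_msun):
    return L_EDD_COEFF * m_bh_msun

m_bh = 5.3e6                   # ~SMBH mass of NGC 1566 from the text
l_edd = eddington_luminosity(m_bh)
l_bol = 2e-3 * l_edd           # Eddington ratio of ~2e-3 from the text
print(l_edd, l_bol)            # ~6.7e44 erg/s and ~1.3e42 erg/s
```

A ratio of ~10^-3 places the nucleus in the low-accretion regime typical of nearby Seyferts, consistent with the feeding questions posed in Part I.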

Relevance: 30.00%

Abstract:

China's water pollution control law stipulates the Water Pollution Discharge Permit (WPDP) institution and authorizes the State Council to draft the regulations for its implementation and enforcement. To date, however, national regulations have not been established, and the permitting system has been operating under provincial regulations. In contrast, the effluent permit system in the USA has operated for more than 40 years with relatively successful results. The CWA/NPDES experience therefore offers a valuable reference for China's water permit system.