978 results for Must -- Analysis


Relevance:

30.00%

Publisher:

Abstract:

In Canada freedom of information must be viewed in the context of governing: how do you deal with an abundance of information while balancing a diversity of competing interests? How can you ensure people are informed enough to participate in crucial decision-making, yet willing enough to let some administrative matters be dealt with in camera, without their involvement in every detail? In an age when taxpayers' coalition groups are on the rise, and the government is encouraging the establishment of Parent Council groups for schools, the issues and challenges presented by access to information and protection of privacy legislation are real ones. The province of Ontario's decision to extend freedom of information legislation to local governments does not ensure, or equate to, full public disclosure of all facts, nor does it necessarily guarantee complete public comprehension of an issue. The mere fact that local governments, like school boards, decide to collect, assemble or record some information and not other information implies that a prior decision was made by "someone" about what was important to record or keep. That in itself means that not all the facts will be disclosed, regardless of the presence of legislation. The resulting lack of information can lead to public mistrust and a lack of confidence in those who govern. This is completely contrary to the spirit of the legislation, which was to provide interested members of the community with facts so that values like political accountability and trust could be ensured, and meaningful criticism and input obtained on matters affecting the whole community. This thesis first reviews the historical reasons for adopting freedom of information legislation, reasons which are rooted in our parliamentary system of government.
However, the same reasoning for enacting such legislation cannot be applied carte blanche to the municipal level of government in Ontario, or more specifically to the programs, policies or operations of a school board. The purpose of this thesis is to examine whether the Municipal Freedom of Information and Protection of Privacy Act, 1989 (MFIPPA) was a necessary step to ensure greater openness from school boards. Based on a review of the Orders made by the Office of the Information and Privacy Commissioner/Ontario, it also assesses how successfully freedom of information legislation has been implemented at the municipal level of government. The Orders provide an opportunity to review what problems school boards have encountered and what guidance the Commissioner has offered. Reference is made to a value framework as an administrative tool for critically analyzing the suitability of MFIPPA to school boards. The conclusion is drawn that MFIPPA appears to have inhibited rather than facilitated openness in local government. This may be attributed to several factors, including the general uncertainty, confusion and discretion in interpreting various provisions and exemptions in the Act. Some of the uncertainty is due to the fact that too few school board staff are familiar with the Act. The complexity of the Act and its legalistic procedures have over-formalized the processes of exchanging information. In addition, there appears to be a concern among municipal officials that granting any access to information may violate the personal privacy rights of others. These concerns translate into indecision and extreme caution in responding to inquiries. The result is delay in responding to information requests and a lack of uniformity in the responses given. However, the mandatory review of the legislation does afford an opportunity to address some of these problems and to make this complex Act more suitable for application to school boards.
In order for the Act to function more efficiently and effectively, legislative changes must be made to MFIPPA. It is important that the recommendations for improving the Act be adopted before the government extends this legislation to any other public entities.

Relevance:

30.00%

Publisher:

Abstract:

Individuals with intellectual disabilities (ID) as a group have been subject to abuse, and they need to be made aware of their rights. The 3Rs: Rights, Respect and Responsibility Human Rights Project promotes rights awareness among individuals with ID, their caregivers and family members. To be effective, abuse prevention must include support from the whole organization and its processes. This research evaluated the impact of the 3Rs initiative on the organization, focusing particularly on descriptions of organizational change perceived by full-time staff and managers in response to the initiation of the 3Rs Project. Behavioural interviews were conducted, and a thematic analysis was used to describe changes in the organizational culture and the behavioural mechanisms maintaining those changes. Systemic barriers to change were also explored. The results indicate that the Association is effectively implementing and supporting the rights-based philosophy.

Relevance:

30.00%

Publisher:

Abstract:

The introduction of new biotechnologies into any health care system is a complex process closely tied to economic, political and cultural factors, and it consequently raises a number of social and ethical questions. In the particular situation of Argentina - that is, large social inequalities between citizens, scarce health resources, limited access to basic services and an absence of specific policies - the introduction of genetic technologies poses serious challenges that policy makers must address. This project examines the case of prenatal genetic testing in the context of the Argentine health system to illustrate how complex its introduction can be in a nation where equal access to health services has yet to be achieved. Legal restrictions and religious precepts that influence the use of genetic technologies must also be considered, which underlines the need to develop a comprehensive framework for the technology assessment process, in order to support the development of coherent and innovative policy recommendations applicable to the particular context of Argentina.

Relevance:

30.00%

Publisher:

Abstract:

We propose to show in this paper that time series obtained from biological systems such as the human brain are invariably nonstationary because of the different time scales involved in the dynamical process. This makes the invariant parameters time dependent. We made a global analysis of EEG data obtained from eight locations on the skull and studied simultaneously the dynamical characteristics of various parts of the brain. We have shown that the dynamical parameters are sensitive to the time scales, and hence, in the study of the brain, one must identify all the relevant time scales involved in the process to gain insight into its working.
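The time-scale sensitivity described above can be sketched numerically (a synthetic illustration, not the authors' EEG pipeline; the signal, sampling and window length are invented):

```python
import numpy as np

# A fast oscillation with a slowly varying envelope mixes two time scales,
# so a statistic estimated over short windows drifts with window position,
# even though a single global estimate would hide that time dependence.
t = np.linspace(0, 10, 5000)
signal = np.sin(2 * np.pi * 40 * t) * (1 + 0.8 * np.sin(2 * np.pi * 0.2 * t))

def windowed_variance(x, win):
    """Variance over consecutive non-overlapping windows of length win."""
    n = len(x) // win
    return np.array([x[i * win:(i + 1) * win].var() for i in range(n)])

v = windowed_variance(signal, 500)      # ten 1-second windows
spread = v.max() - v.min()              # large spread: the estimate is time dependent
```

The same effect is what makes a single "invariant" computed over a whole EEG record misleading when the underlying dynamics involve several time scales.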

Relevance:

30.00%

Publisher:

Abstract:

The practical applications of microstrip antennas for mobile systems are in portable or pocket-size equipment and in vehicles. Antennas for VHF/UHF handheld portable equipment, such as pagers, portable telephones and transceivers, must naturally be small in size, light in weight and compact in structure. There is a growing tendency for portable equipment to be made smaller and smaller as the demand for personal communication rapidly increases, and the development of very compact handheld units has become urgent. The main aim of this thesis work is to develop microstrip patch antennas of increasingly reduced size. It is well known that the smaller the antenna, the lower its efficiency. During the period of work, three different compact circular-sided microstrip patches are developed and analysed, which offer a significant size reduction compared with the standard circular disk antenna (the most compact of the basic microstrip patch configurations) without much deterioration of properties such as gain, bandwidth and efficiency. In addition, interesting results, namely dual-port operation and circular polarization, are observed for some typical designs of these patches. These make the patches suitable for satellite and mobile communication systems. Theoretical investigations are carried out on these compact patches, and empirical relations are developed by modifying the standard equations of rectangular and circular disk microstrip patches, which help to predict the resonant frequencies easily.
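The kind of resonant-frequency prediction mentioned in the closing sentence can be illustrated with the standard cavity-model relation for a circular disk patch (a textbook formula, not the modified empirical relations developed in the thesis; the dimensions and permittivity below are assumed example values):

```python
import math

def circular_patch_fr(a, h, eps_r):
    """Dominant TM11 resonant frequency (Hz) of a circular microstrip patch.

    Standard cavity-model formula with fringing correction: a is the
    physical patch radius (m), h the substrate height (m), eps_r the
    relative permittivity of the substrate.
    """
    c = 2.998e8
    # Effective radius accounting for the fringing fields at the patch edge.
    a_e = a * math.sqrt(1 + (2 * h / (math.pi * a * eps_r))
                        * (math.log(math.pi * a / (2 * h)) + 1.7726))
    # 1.8412 is the first zero of the derivative of the Bessel function J1.
    return 1.8412 * c / (2 * math.pi * a_e * math.sqrt(eps_r))

# Example: a 2 cm radius patch on a 1.6 mm FR4-like substrate (eps_r = 4.4),
# which resonates near 2 GHz.
f = circular_patch_fr(0.02, 0.0016, 4.4)
```

The thesis' modified relations would replace this standard expression for its compact circular-sided geometries.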

Relevance:

30.00%

Publisher:

Abstract:

This work identifies the importance of plenum pressure for the performance of the data centre. The methodology currently followed in the industry treats the pressure drop across the tile as a dependent variable, but this work shows that it is the single independent variable responsible for the entire flow dynamics in the data centre, and that any design or assessment procedure must treat the pressure difference across the tile as the primary independent variable. This concept is further explained by studies on the effect of dampers on the flow characteristics. Dampers are found to introduce an additional pressure drop, thereby reducing the effective pressure drop across the tile. A damper changes the flow in both quantitative and qualitative respects, but only the quantitative effect is considered when the damper is used as an aid for capacity control. Results from the present study suggest that dampers must be avoided in the data centre, and that well-designed tiles giving the required flow rates must be used in the appropriate locations. The effect of hot-air recirculation is also studied, under suitable assumptions. The study identifies the pressure drop across the tile as a dominant parameter governing recirculation: the rack suction pressure of the hardware, together with the pressure drop across the tile, determines the point of recirculation in the cold aisle, and the positioning of hardware in the racks plays an important role in controlling that point. The present study is thus helpful in the design of data centre air flow based on the theory of jets, allowing the air flow to be modelled both quantitatively and qualitatively.
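The central claim above, that the pressure difference across the tile drives the flow, can be sketched with a simple orifice-type relation (an illustrative model with invented loss coefficients and pressures, not the study's analysis):

```python
import math

RHO = 1.2          # air density, kg/m^3

def tile_flow(dp_tile, area, k_tile):
    """Flow (m^3/s) through a tile of area (m^2) and loss coefficient
    k_tile under a pressure difference dp_tile (Pa): dp = k * rho * V^2 / 2."""
    return area * math.sqrt(2 * dp_tile / (RHO * k_tile))

dp_plenum = 25.0                              # plenum-to-room pressure, Pa
q_open = tile_flow(dp_plenum, 0.25, 10.0)     # tile alone

# A damper in series dissipates part of dp_plenum before the tile,
# reducing the effective pressure drop across the tile and hence the flow.
dp_damper = 10.0
q_damped = tile_flow(dp_plenum - dp_damper, 0.25, 10.0)
```

The square-root dependence is why the tile pressure drop, rather than any downstream quantity, is the natural independent variable: halving it does not halve the flow.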

Relevance:

30.00%

Publisher:

Abstract:

Near-infrared spectroscopy can be a workhorse technique for materials analysis in industries such as agriculture, pharmaceuticals, chemicals and polymers. A near-infrared spectrum represents combination bands and overtone bands that are harmonics of absorption frequencies in the mid-infrared. Near-infrared absorption includes a combination-band region immediately adjacent to the mid-infrared and three overtone regions, and all four near-infrared regions contain "echoes" of the fundamental mid-infrared absorptions. For example, vibrations in the mid-infrared due to C-H stretches will produce four distinct bands in each of the overtone and combination regions. As the bands become more removed from the fundamental frequencies they become more widely separated from their neighbors, more broadened, and dramatically reduced in intensity. Because near-infrared bands are much less intense, more of the sample can be used to produce a spectrum, and sample preparation is greatly reduced or eliminated. In addition, long path lengths and the ability to sample through glass in the near-infrared allow samples to be measured in common media such as culture tubes, cuvettes and reaction bottles. This is unlike the mid-infrared, where very small amounts of a sample produce a strong spectrum, so sample preparation techniques must be employed to limit the amount of the sample that interacts with the beam. In the present work we describe the successful fabrication and calibration of a high-resolution linear spectrometer using a tunable diode laser and a 36 m path length cell, and the measurement of the highly resolved structure of the OH group in methanol in the Δv = 3 transition region. We then analyse the NIR spectrum of certain aromatic molecules and study the substituent effects using local mode theory.
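The overtone structure described above follows from the standard anharmonic-oscillator term values (generic spectroscopy relations; the constants below are typical C-H-like magnitudes, not values measured in this work):

```python
# For a Morse-like bond the vibrational term values are
#   G(v) = we*(v + 1/2) - wexe*(v + 1/2)^2   [cm^-1],
# so the 0 -> n transition sits at
#   nu(n) = n*we - (n^2 + n)*wexe,
# slightly below n times the fundamental: successive overtones drift
# further from simple harmonics and weaken, as described above.

def overtone(n, we, wexe):
    """Wavenumber (cm^-1) of the 0 -> n transition of an anharmonic oscillator."""
    return n * we - (n ** 2 + n) * wexe

we, wexe = 3000.0, 60.0                  # assumed constants, cm^-1
fundamental = overtone(1, we, wexe)      # mid-infrared band
second = overtone(2, we, wexe)           # first overtone region
third = overtone(3, we, wexe)            # the "delta v = 3" region probed here
```

With these assumed constants the Δv = 3 band falls well short of three times the fundamental, which is the anharmonic shift the long-path measurement resolves.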

Relevance:

30.00%

Publisher:

Abstract:

Kochi, the commercial capital of Kerala, South India, and the second most important city on the western coast after Mumbai, has a wide variety of residential environments. Due to rapid population growth, changing lifestyles, food habits and living standards, institutional weaknesses, improper choice of technology and public apathy, the present pattern of the city can be classified as haphazard growth, with the typical problems of unplanned urban development, especially in the case of solid waste management. To have better living conditions for ourselves and our future generations, we must know where we are now and how far we need to go. Each individual must calculate how much nature we use and compare it to how much nature we have available. This can be achieved by applying the concept of the ecological footprint. Ecological footprint analysis (EFA) is a quantitative tool that represents, in spatial terms, the ecological load imposed on the earth by humans. The aim of applying EFA to Kochi city is to quantify the consumption and waste generation of the population and to compare them with the existing biocapacity. By quantifying the ecological footprint we can formulate strategies to reduce the footprint and thereby achieve sustainable living. The paper discusses the various footprint components of Kochi city and analyses in detail the waste footprint of the residential areas using a waste footprint analyzer. An attempt is also made to suggest some waste footprint reduction strategies, thereby making the city sustainable as far as solid waste management is concerned.
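The bookkeeping that EFA performs can be sketched as follows (the category names, consumption figures, yields and equivalence factors are invented for illustration, not Kochi data):

```python
# Each consumption or waste flow is converted to the bioproductive area
# needed to supply or absorb it, then the per-category areas are summed
# into one footprint expressed in global hectares (gha).

def footprint(consumption_t, yield_t_per_ha, equivalence):
    """Footprint in global hectares for one category: area needed to
    produce/absorb the flow, weighted by land-type productivity."""
    return consumption_t / yield_t_per_ha * equivalence

categories = {
    # name: (tonnes per year, tonnes/ha/year, equivalence factor)
    "cropland": (1200.0, 2.7, 2.5),
    "waste_sink": (800.0, 4.0, 1.3),
}
total_gha = sum(footprint(*v) for v in categories.values())
```

Comparing `total_gha` with the region's available biocapacity (in the same units) is what indicates whether the population lives within its ecological means.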

Relevance:

30.00%

Publisher:

Abstract:

Computer-aided model validation techniques are now widespread in the field of structural dynamics. Experimental modal data are used to correct a numerical model for further analyses. The validated model, however, represents only the dynamic behaviour of the structure that was tested. In reality there are many factors that inevitably lead to varying modal test results: changing environmental conditions during a test, slightly different test setups, a test on a nominally identical but different structure (e.g. from series production), etc. Before a stochastic simulation can be carried out, a number of assumptions must be made for the random variables used. Consequently, an inverse method is needed that makes it possible to identify a stochastic model from experimental modal data. This work describes the development of a parameter-based approach for identifying stochastic simulation models in the field of structural dynamics. The method developed relies on first-order sensitivities, with which the parameter means and covariances of the numerical model can be determined from stochastic experimental modal data.
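The first-order identification described in the closing sentence can be sketched as follows (the sensitivity matrix and modal statistics below are invented for illustration, not the method's actual data):

```python
import numpy as np

# With eigenvalue sensitivities S = d(lambda)/d(theta), the linearised model
#   lambda ~ lambda0 + S (theta - theta0)
# lets the mean shift and covariance of the parameters be pulled back from
# scattered experimental modal data via the pseudo-inverse:
#   delta_theta = S^+ delta_lambda_mean,   C_theta = S^+ C_lambda (S^+)^T.
S = np.array([[2.0, 0.5],
              [0.3, 1.5],
              [1.0, 1.0]])         # 3 eigenvalues, 2 model parameters
S_pinv = np.linalg.pinv(S)

delta_lambda_mean = np.array([0.20, 0.15, 0.10])   # mean shift of test eigenvalues
C_lambda = np.diag([0.04, 0.02, 0.03])             # scatter of test eigenvalues

delta_theta = S_pinv @ delta_lambda_mean           # parameter mean update
C_theta = S_pinv @ C_lambda @ S_pinv.T             # parameter covariance
```

In practice this update would be iterated, since the sensitivities are only valid near the current parameter estimate.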

Relevance:

30.00%

Publisher:

Abstract:

Hydrogeological research usually includes statistical studies devised to elucidate the mean background state, characterise relationships among different hydrochemical parameters, and show the influence of human activities. These goals are achieved either by means of a statistical approach or by mixing models between end-members. Compositional data analysis has proved to be effective with the first approach, but there is no commonly accepted solution to the end-member problem in a compositional framework. We present here a possible solution based on factor analysis of compositions, illustrated with a case study. We find two factors on the compositional biplot by fitting two non-centred orthogonal axes to the most representative variables. Each of these axes defines a subcomposition, grouping the variables that lie nearest to it. For each subcomposition a log-contrast is computed and rewritten as an equilibrium equation. The two factors can be interpreted as the isometric log-ratio (ilr) coordinates of three hidden components, which can be plotted in a ternary diagram and might be interpreted as end-members. We have analysed 14 molarities at 31 sampling stations along the Llobregat River and its tributaries, measured monthly over two years. We obtained a biplot with 57% of the total variance explained, from which we extracted two factors: factor G, reflecting geological background enhanced by potash mining, and factor A, essentially controlled by urban and/or farming wastewater. Graphical representation of these two factors allows us to identify three extreme samples, corresponding to pristine waters, potash mining influence and urban sewage influence. To confirm this, we have available analyses of the diffuse and widespread point sources identified in the area: springs, potash mining lixiviates, sewage and fertilisers.
Each of these sources shows a clear link with one of the extreme samples, except the fertilisers, owing to the heterogeneity of their composition. This approach is a useful tool for distinguishing end-members and characterising them, an issue generally difficult to solve. It is worth noting that the end-member composition cannot be fully estimated, only characterised through log-ratio relationships among components. Moreover, the influence of each end-member in a given sample must be evaluated relative to the other samples. These limitations are intrinsic to the relative nature of compositional data.
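The ilr construction used above can be sketched for a generic 3-part composition (illustrative code with one standard choice of balances, not the Llobregat analysis):

```python
import numpy as np

# Two balances of a 3-part composition give its two isometric log-ratio
# coordinates; the inverse map recovers the closed composition, so factors
# found in ilr space can be read back as compositions (the "hidden
# components" interpreted as end-members).
def ilr(x):
    """ilr coordinates of a 3-part composition x (positive parts, any total)."""
    b1 = np.sqrt(1 / 2) * np.log(x[0] / x[1])
    b2 = np.sqrt(2 / 3) * np.log(np.sqrt(x[0] * x[1]) / x[2])
    return np.array([b1, b2])

def ilr_inv(z):
    """Inverse ilr: clr values from the contrast matrix, then close to sum 1."""
    y = np.array([z[0] / np.sqrt(2) + z[1] / np.sqrt(6),
                  -z[0] / np.sqrt(2) + z[1] / np.sqrt(6),
                  -2 * z[1] / np.sqrt(6)])
    x = np.exp(y)
    return x / x.sum()

x = np.array([0.5, 0.3, 0.2])
z = ilr(x)          # unconstrained coordinates, suitable for factor analysis
x_back = ilr_inv(z)  # recovers the original composition
```

Because ilr is an isometry, distances and angles computed on the factors correspond to genuine compositional differences, which is what justifies interpreting extreme points as end-members.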

Relevance:

30.00%

Publisher:

Abstract:

Psychometricians usually apply classical factor analysis to evaluate the construct validity of rank-order scales. Nevertheless, these scales have particular characteristics that must be taken into account: total scores and ranks are highly relevant

Relevance:

30.00%

Publisher:

Abstract:

A compositional time series is obtained when a compositional data vector is observed at different points in time. Inherently, then, a compositional time series is a multivariate time series with important constraints on the variables observed at any instant in time. Although this type of data frequently occurs in situations of real practical interest, a trawl through the statistical literature reveals that research in the field is very much in its infancy and that many theoretical and empirical issues remain to be addressed. Any appropriate statistical methodology for the analysis of compositional time series must take into account these constraints, which are not allowed for by the usual statistical techniques available for analysing multivariate time series. One general approach to analysing compositional time series consists of applying an initial transform to break the positive and unit-sum constraints, followed by analysis of the transformed time series using multivariate ARIMA models. In this paper we discuss the use of the additive log-ratio, centred log-ratio and isometric log-ratio transforms. We also present results from an empirical study designed to explore how the selection of the initial transform affects subsequent multivariate ARIMA modelling, as well as the quality of the forecasts.
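The transform step discussed above can be sketched with the additive log-ratio (a synthetic 3-part series, not the study's data; the back-transform shows how the unit-sum constraint is recovered after modelling):

```python
import numpy as np

# The alr maps each closed composition to an unconstrained real vector that
# ordinary multivariate time-series models (e.g. vector ARIMA) can handle;
# the inverse alr maps forecasts back onto the simplex.
def alr(X):
    """alr of rows of X (n x D compositions): log of first D-1 parts over the last."""
    return np.log(X[:, :-1] / X[:, -1:])

def alr_inv(Y):
    """Inverse alr: append a zero column, exponentiate, re-close to sum 1."""
    E = np.exp(np.hstack([Y, np.zeros((Y.shape[0], 1))]))
    return E / E.sum(axis=1, keepdims=True)

X = np.array([[0.2, 0.5, 0.3],      # one 3-part composition per time point
              [0.3, 0.4, 0.3],
              [0.1, 0.6, 0.3]])
Y = alr(X)          # unconstrained series, ready for ARIMA modelling
X_back = alr_inv(Y)  # forecasts in Y-space map back to valid compositions
```

The centred and isometric log-ratio transforms compared in the paper follow the same pattern, differing only in the log-contrast matrix applied to each row.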

Relevance:

30.00%

Publisher:

Abstract:

In the quantum mechanics literature it is common to find descriptors based on the pair density or the electron density, with varying success depending on the applications they address. For a descriptor to have chemical meaning it must provide a definition of an atom in a molecule, or be able to identify regions of molecular space associated with some chemical concept (such as a lone pair or a bonding region, among others). Along these lines, several partitioning schemes have been proposed: the theory of atoms in molecules (AIM), the electron localization function (ELF), Voronoi cells, Hirshfeld atoms, fuzzy atoms, etc. The aim of this thesis is to explore density descriptors based on partitions of molecular space of the AIM, ELF or fuzzy-atom type, to analyse existing descriptors at different levels of theory, to propose new aromaticity descriptors, and to study the ability of all these tools to discriminate between different reaction mechanisms.

Relevance:

30.00%

Publisher:

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
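The accumulation of components into a rough variogram can be sketched as follows (the stage components and lags below are invented numbers, not results from the three surveys):

```python
import numpy as np

# With variance components estimated per sampling stage (finest lag first),
# the rough variogram ordinate at each sampling lag is the running sum of
# the components up to and including that stage: short lags capture only
# fine-scale variance, long lags accumulate all sources up to their scale.
lags = np.array([1.0, 10.0, 100.0, 1000.0])      # separating distances, m
components = np.array([0.8, 1.5, 2.1, 0.6])      # ANOVA variance components

gamma = np.cumsum(components)    # rough variogram ordinates at each lag
```

With only four ordinates this is a coarse picture, but it is often enough to choose the lag range worth sampling densely in a follow-up survey.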
