770 results for harm-minimization
Abstract:
This study investigated, through interviews with teachers and pedagogical directors of a private basic-education school in the city of Aracaju, Sergipe, Brazil, whether they had perceived the changes that occurred in the world in the 21st century, taking as variables Globalization, Technology, the Environment, Information, and Changes in Education. Having established their awareness of these changes, we analysed which practices were adopted to respond to the new demands. We asked what educational actors are doing to prepare citizens to face and adapt to these changes, that is, the applicability of the Four Pillars of Education for the 21st century, with the Delors Report (1999) as reference. The interviews were divided into two parts: the first considered the five variables characterizing the changes in the world and raised questions about each of them; the second, based on the educational pillars, asked how these are experienced in school practice. Analysis of the interviews showed that the interviewees had indeed perceived the changes in the world: the fragility of relationships, rising social inequality, the benefits and harms of technology, the modest shift in the stance of companies and the State towards the environment, the excess of information we receive and our limited capacity to manage it, and educational policies. The School responds to these changes by considering all the knowledge needed for 21st-century Education, that is, the pillars proposed by Unesco. The School's stance towards these demands puts the discourse into practice and helps spread these practices to other schools, which answers the research question positively.
Abstract:
This work results from the Intervention Project carried out within the 2nd Cycle Course in Special Education (cognitive and motor domain) at the Universidade Lusófona de Humanidades e Tecnologias. The intervention addresses, from an inclusive perspective, the minimization of the difficulties a girl presents in reading, writing, and socialization. M. is the fictitious name of the student targeted by the intervention. She currently attends the 3rd year of schooling at a public school in Lisbon, the area where she lives. The literature review supports and facilitates a clear and concise understanding of the intervention carried out and of the positions defended on this matter. It covers social and school exclusion, the Inclusive school and the obstacles still found in schools, prejudice, students with special educational needs, curricular adaptations, cooperative learning and pedagogical differentiation, learning difficulties and, finally, communication and oral and written language. To gather information about M., about the context of the intervention, and about her whole process of school inclusion, we used documentary research, semi-directive interviews with the class teacher and the special education teacher, naturalistic observation, sociometry, and field notes as complementary methodological supports. The overall intervention plan was drawn up by relating and cross-referencing the data resulting from the analysis of the information gathered. To ground the intervention, we first characterized her school and family context and then M. herself. The guiding principles of the intervention rest on an action-research perspective and kept in view the objectives defined for M. The activities were carried out within a highly structured learning approach, reflected upon and evaluated throughout the process, involving all participants. This intervention led us to stimulate differentiated, inclusive educational practices in the class, together with the class teacher, the special education teacher, and M.'s classmates.
Abstract:
Private governance is currently being evoked as a viable solution to many public policy goals. However, in some circumstances it has been shown to produce more harm than good, and even disastrous consequences, as in the case of the financial crisis raging in most advanced economies. Although the current track record of private regulatory schemes is mixed, policy guidance documents around the world still require that policy-makers give priority to self- and co-regulation, with little or no additional guidance on when, and under what circumstances, these solutions can prove viable from a public policy perspective. Drawing on an array of examples from several policy fields, this paper approaches regulation as a public-private collaborative form and attempts to identify policy tools that public policy-makers can apply to treat private governance, efficiently and effectively, as a solution rather than a problem. We propose a six-step theoretical framework and argue that impact assessment (IA) techniques should: i) define an integrated framework that includes the possibility of private regulation being used either as an alternative or as a complement to public legislation; ii) involve private parties in public IAs in order to define the strategy or strategies that would best ensure achievement of the regulatory objectives; and iii) contemplate the deployment of indicators related to the governance and activities of the regulators and their ability to coordinate and resolve disputes with other regulators.
Abstract:
An algorithm is presented for the generation of molecular models of defective graphene fragments containing a majority of 6-membered rings with a small number of 5- and 7-membered rings as defects. The structures are generated from an initial random array of points in 2D space, which is then subjected to Delaunay triangulation. The dual of the triangulation forms a Voronoi tessellation of polygons with a range of ring sizes. An iterative cycle of refinement, involving deletion and addition of points followed by further triangulation, is performed until the user-defined criteria for the number of defects are met. The array of points and connectivities is then converted to a molecular structure and subjected to geometry optimization using a standard molecular modeling package to generate final atomic coordinates. Based on molecular mechanics with minimization, this automated method can generate structures that conform to user-supplied criteria and avoid the potential bias associated with the manual building of structures. One application of the algorithm is the generation of structures for evaluating the reactivity of different defect sites. Ab initio electronic structure calculations on a representative structure indicate preferential fluorination close to 5-ring defects.
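The core control loop of the abstract (refine until a user-defined defect criterion is met) can be illustrated without the triangulation machinery. The sketch below is a schematic stand-in, not the paper's algorithm: it works directly on a list of ring sizes rather than on a point set, and the healing/defect moves stand in for the point deletion/addition and re-triangulation steps. All names and parameter values are illustrative assumptions.

```python
import random

def refine_rings(ring_sizes, target_defect_frac, tol=0.02, max_iter=10_000, seed=0):
    """Adjust a population of 5-, 6-, and 7-membered rings until the
    defect fraction (non-6 rings) is within tol of the target.
    Schematic stand-in for the iterative refinement cycle."""
    rng = random.Random(seed)
    rings = list(ring_sizes)
    frac = sum(1 for r in rings if r != 6) / len(rings)
    for _ in range(max_iter):
        defects = sum(1 for r in rings if r != 6)
        frac = defects / len(rings)
        if abs(frac - target_defect_frac) <= tol:
            return rings, frac      # user-defined criterion met
        i = rng.randrange(len(rings))
        if frac > target_defect_frac:
            rings[i] = 6                    # "heal" a defect
        else:
            rings[i] = rng.choice([5, 7])   # introduce a 5- or 7-ring defect
    return rings, frac
```

Starting from an all-defect population and asking for a 10% defect fraction, the loop converges by healing rings one at a time, mirroring how the real algorithm accepts or rejects point edits against the defect-count criterion.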
Abstract:
Background: Pharmacy aseptic units prepare and supply injectables to minimise risks. The UK National Aseptic Error Reporting Scheme has been collecting data on pharmacy compounding errors, including near misses, since 2003. Objectives: Cumulative reports from January 2004 to December 2007, inclusive, were analysed. Methods: The variables of product type, error type, staff making and detecting errors, stage at which errors were detected, perceived contributory factors, and potential or actual outcomes were presented by cross-tabulation of the data. Results: A total of 4691 reports were submitted against an estimated 958 532 items made, giving an overall error rate of 0.49%. Most errors were detected before reaching patients, with only 24 detected during or after administration. The highest number of reports related to adult cytotoxic preparations (40%), and the most frequently recorded error was a labelling error (34.2%). Errors were mostly detected at the first check in the assembly area (46.6%). Individual staff error contributed most (78.1%) to overall errors, while errors with paediatric parenteral nutrition were attributed to low staffing levels more often than errors with other products. The majority of errors (68.6%) had no potential patient outcomes attached, while paediatric cytotoxic products and paediatric parenteral nutrition appeared to be associated with greater levels of perceived patient harm. Conclusions: The majority of reports related to near misses, and this study highlights scope for examining current arrangements for checking and releasing products, certainly for paediatric cytotoxic and paediatric parenteral nutrition preparations within aseptic units, albeit in the context of resource and capacity constraints.
Abstract:
The frequency responses of two 50 Hz and one 400 Hz induction machines have been measured experimentally over a frequency range of 1 kHz to 400 kHz. This study has shown that the stator impedances of the machines behave in a similar manner to a parallel resonant circuit, and hence have a resonant point at which the input impedance of the machine is at a maximum. This maximum impedance point was found experimentally to be as low as 33 kHz, which is well within the switching frequency ranges of modern inverter drives. This paper investigates the possibility of exploiting the maximum impedance point of the machine, by taking it into consideration when designing an inverter, in order to minimize ripple currents due to the switching frequency. Minimization of the ripple currents would reduce torque pulsation and losses, increasing overall performance. A modified machine model was developed to take into account the resonant point, and this model was then simulated with an inverter to demonstrate the possible advantages of matching the inverter switching frequency to the resonant point. Finally, in order to experimentally verify the simulated results, a real inverter with a variable switching frequency was used to drive an induction machine. Experimental results are presented.
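The parallel-resonant behaviour described above is easy to reproduce numerically. The sketch below sweeps the impedance magnitude of an ideal parallel RLC circuit over the 1-400 kHz measurement range and locates the peak; the R, L, and C values are hypothetical, chosen only to put the resonance in the tens-of-kHz region the abstract mentions, and are not the machine's measured parameters.

```python
import math

def parallel_rlc_impedance(f, R, L, C):
    """|Z| of an ideal parallel RLC circuit at frequency f (Hz).
    Admittance Y = 1/R + j(wC - 1/(wL)); |Z| = 1/|Y|."""
    w = 2.0 * math.pi * f
    g = 1.0 / R                      # conductance
    b = w * C - 1.0 / (w * L)        # net susceptance (zero at resonance)
    return 1.0 / math.hypot(g, b)

# Hypothetical lumped parameters (illustrative only, not measured values).
R, L, C = 5_000.0, 10e-3, 2e-9

# Sweep 1 kHz .. ~401 kHz, the range measured in the study, in 100 Hz steps.
freqs = [1_000.0 + 100.0 * k for k in range(4_000)]
f_peak = max(freqs, key=lambda f: parallel_rlc_impedance(f, R, L, C))

# Analytic resonance for comparison: f = 1 / (2*pi*sqrt(L*C)).
f_analytic = 1.0 / (2.0 * math.pi * math.sqrt(L * C))
```

At resonance the susceptance term cancels, so |Z| peaks at R; matching the inverter switching frequency to `f_peak` is exactly the ripple-minimization idea the paper explores.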
Abstract:
Quasi-Newton-Raphson minimization and conjugate gradient minimization have been used to solve the crystal structures of famotidine form B and capsaicin from X-ray powder diffraction data and characterize the χ² agreement surfaces. One million quasi-Newton-Raphson minimizations found the famotidine global minimum with a frequency of ca. 1 in 5000 and the capsaicin global minimum with a frequency of ca. 1 in 10 000. These results, which are corroborated by conjugate gradient minimization, demonstrate the existence of numerous pathways from some of the highest points on these χ² agreement surfaces to the respective global minima, which are passable using only downhill moves. This important observation has significant ramifications for the development of improved structure determination algorithms.
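The idea of measuring how often repeated local minimizations reach the global minimum can be demonstrated on a toy surface. The sketch below is not the crystallographic χ² surface: it uses a one-dimensional double well and plain steepest descent as a stand-in for the quasi-Newton minimizer, and simply counts the fraction of random starts that end in the global well. All functions and constants are illustrative assumptions.

```python
import random

def f(x):
    # Toy "agreement surface": a double well whose left minimum is global
    # (the +0.3*x tilt lowers the well near x ~ -1.04).
    return (x * x - 1.0) ** 2 + 0.3 * x

def df(x):
    return 4.0 * x * (x * x - 1.0) + 0.3

def descend(x, lr=0.01, steps=500):
    # Crude steepest descent: downhill moves only, like the minimizers above.
    for _ in range(steps):
        x -= lr * df(x)
    return x

rng = random.Random(1)
finals = [descend(rng.uniform(-2.0, 2.0)) for _ in range(1000)]
hits = sum(1 for x in finals if x < 0)       # ended in the global (left) well
frequency = hits / len(finals)
```

On this surface roughly half the random starts reach the global minimum; on the real famotidine and capsaicin surfaces the measured frequencies were far smaller (ca. 1 in 5000 and 1 in 10 000), but the counting procedure is the same.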
Abstract:
Fitness of hybrids between genetically modified (GM) crops and wild relatives influences the likelihood of ecological harm. We measured fitness components in spontaneous (non-GM) rapeseed × Brassica rapa hybrids in natural populations. The F1 hybrids yielded 46.9% of the seed output of B. rapa, were 16.9% as effective as males on B. rapa, and exhibited increased self-pollination. Assuming 100% GM rapeseed cultivation, we conservatively predict < 7000 second-generation transgenic hybrids annually in the United Kingdom (i.e. ~20% of F1 hybrids). Conversely, whilst reduced hybrid fitness improves the feasibility of bio-containment, stage projection matrices suggest broad scope for some transgenes to offset this effect by enhancing fitness.
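The stage-projection-matrix argument in the last sentence can be sketched numerically: a population's long-run growth rate is the dominant eigenvalue of its projection matrix, and a transgene that raises a matrix entry (e.g. fecundity) raises that eigenvalue. The two-stage matrix below is hypothetical, with values chosen only so the baseline growth rate is exactly 1.0; it is not taken from the study.

```python
def dominant_eigenvalue(M, iters=200):
    """Power iteration: long-run growth rate (lambda) of a stage-structured
    population with projection matrix M (entries = per-step transitions)."""
    n = len(M)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)    # rescale; converges to lambda
        v = [x / lam for x in w]
    return lam

# Hypothetical 2-stage (juvenile -> adult) matrix; baseline lambda = 1.0,
# i.e. the hybrid population exactly replaces itself.
M = [[0.2, 4.0],
     [0.1, 0.5]]
```

Raising the adult-fecundity entry from 4.0 to 6.0 (a crude model of a fitness-enhancing transgene) pushes the dominant eigenvalue above 1.0, i.e. the population grows despite the reduced hybrid fitness, which is the offsetting effect the abstract describes.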
Abstract:
Research on the environmental risks of gene flow from genetically modified (GM) crops to wild relatives has traditionally emphasized recipients yielding the most hybrids. For GM rapeseed (Brassica napus), interest has centred on the 'frequently hybridizing' Brassica rapa over relatives such as Brassica oleracea, for which spontaneous hybrids are unreported in the wild. In two sites where rapeseed and wild B. oleracea grow together, we used flow cytometry and crop-specific microsatellite markers to identify one triploid F1 hybrid, together with nine diploid and two near-triploid introgressants. Given the newly discovered capacity for spontaneous introgression into B. oleracea, we then surveyed the associated flora and fauna to evaluate the capacity of both recipients to harm cohabitant species of acknowledged conservation importance. Only B. oleracea occupies rich communities containing species afforded legislative protection; these include one rare micromoth species that feeds on B. oleracea and warrants further assessment. We conclude that increased attention should now focus on B. oleracea and similar species that yield few crop hybrids but possess scope to affect rare or endangered associates.
Abstract:
Under low-latitude conditions, minimization of solar radiation within the urban environment may often be a desirable criterion in urban design. The dominance of the direct component of global solar irradiance under clear, high-sun conditions requires that street solar access be small. It is well known that the size and proportion of open spaces greatly influence the urban microclimate. This paper is directed towards finding the interaction between urban canyon geometry and incident solar radiation. The effects of building height and street width on the shading of street surfaces and the ground for different orientations have been examined and evaluated. The aim is to explore the extent to which these parameters affect the temperature in the street. This work is based on air and surface temperature measurements taken in different urban street canyons in El-Oued City, Algeria (hot and arid climate). In general, the results show smaller variations in air temperature than in surface temperature, which strongly depends on street geometry and sky view factor. In other words, there is a strong correlation between street geometry, sky view factor, and surface temperatures.
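The height-to-width effect on street shading can be sketched with the standard idealized-canyon geometry: the wall on the sunlit side casts a shadow of width H/tan(altitude) (scaled by how perpendicular the sun is to the street axis), and the shaded fraction of the street floor is that width divided by the street width, capped at 1. This is a textbook simplification, not the paper's measurement method, and the parameter names are illustrative.

```python
import math

def shaded_fraction(height, width, solar_altitude_deg, azimuth_gap_deg=90.0):
    """Fraction of street floor in shadow for an idealized infinite canyon.
    azimuth_gap_deg is the angle between the sun's azimuth and the street
    axis (90 degrees = sun shining straight across the canyon)."""
    if solar_altitude_deg <= 0.0:
        return 1.0                    # sun below horizon: fully shaded
    alt = math.radians(solar_altitude_deg)
    gap = math.radians(azimuth_gap_deg)
    shadow_width = height * math.sin(gap) / math.tan(alt)
    return min(1.0, shadow_width / width)
```

For example, with the sun at 45 degrees across the canyon, a street as wide as its buildings are tall (H/W = 1) is fully shaded, while halving the building height shades only half the floor, which is why deep canyons keep low-latitude streets cool.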
Abstract:
Objective: To determine whether the use of verbal descriptors suggested by the European Union (EU) such as "common" (1-10% frequency) and "rare" (0.01-0.1%) effectively conveys the level of risk of side effects to people taking a medicine. Design: Randomised controlled study with unconcealed allocation. Participants: 120 adults taking simvastatin or atorvastatin after cardiac surgery or myocardial infarction. Setting: Cardiac rehabilitation clinics at two hospitals in Leeds, UK. Intervention: A written statement about one of the side effects of the medicine (either constipation or pancreatitis). Within each side effect condition half the patients were given the information in verbal form and half in numerical form (for constipation, "common" or 2.5%; for pancreatitis, "rare" or 0.04%). Main outcome measure: The estimated likelihood of the side effect occurring. Other outcome measures related to the perceived severity of the side effect, its risk to health, and its effect on decisions about whether to take the medicine. Results: The mean likelihood estimate given for the constipation side effect was 34.2% in the verbal group and 8.1% in the numerical group; for pancreatitis it was 18% in the verbal group and 2.1% in the numerical group. The verbal descriptors were associated with more negative perceptions of the medicine than their equivalent numerical descriptors. Conclusions: Patients want and need understandable information about medicines and their risks and benefits. This is essential if they are to become partners in medicine taking. The use of verbal descriptors to improve the level of information about side effect risk leads to overestimation of the level of harm and may lead patients to make inappropriate decisions about whether or not they take the medicine.
Abstract:
The identification of criminal networks is not a routine exploratory process within the current practice of law enforcement authorities; rather, it is triggered by specific evidence of the criminal activity being investigated. A network is identified when a criminal comes to notice, and any associates who could also be potentially implicated need to be identified, if only to be eliminated from the enquiries as suspects or witnesses, as well as to prevent and/or detect crime. However, an identified network may not be the one causing most harm in a given area. This paper presents a methodology to identify all of the criminal networks present within a Law Enforcement Area and to prioritise those causing most harm to the community. Each crime is allocated a score based on its crime type and how recently it was committed; the network score, which can be used as decision support to help prioritise the network for law enforcement purposes, is the sum of the individual crime scores.
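The scoring scheme in the last sentence (per-crime score from type and recency, network score as the sum) can be sketched directly. The abstract does not publish its weight table or recency function, so the severity weights and the one-year half-life decay below are illustrative assumptions, not the paper's values.

```python
from datetime import date

# Hypothetical severity weights per crime type (illustrative only).
CRIME_WEIGHTS = {"homicide": 100.0, "robbery": 40.0, "burglary": 20.0, "theft": 10.0}

def crime_score(crime_type, committed, today, half_life_days=365.0):
    # Severity weight decayed by recency: a crime half_life_days old
    # counts for half as much as one committed today.
    age_days = (today - committed).days
    return CRIME_WEIGHTS[crime_type] * 0.5 ** (age_days / half_life_days)

def network_score(crimes, today):
    # Network score = sum of the individual crime scores of its members.
    return sum(crime_score(ctype, when, today) for ctype, when in crimes)

# Rank two hypothetical networks for prioritisation.
today = date(2024, 1, 2)
network_a = [("homicide", date(2023, 1, 2)), ("theft", date(2023, 12, 2))]
network_b = [("burglary", date(2023, 12, 2)), ("theft", date(2023, 6, 2))]
ranked = sorted([("A", network_a), ("B", network_b)],
                key=lambda kv: network_score(kv[1], today), reverse=True)
```

Sorting networks by this score gives exactly the decision-support ranking the paper describes: network A outranks B because a year-old homicide still outweighs two recent lower-severity crimes.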
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross-validation is often used to estimate generalization errors when choosing among different network architectures (M. Stone, "Cross validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria (the mean square of the LOO errors for regression, or the LOO misclassification rate for classification), we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization procedure that keeps the subspace spanned by the pruned model orthogonal to the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via an analytic recursive formula, derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, with no additional stopping criterion; and (iii) model structure selection is based directly on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods for pruning a model to gain extra sparsity and improved generalization.
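The LOO-driven backward elimination idea can be sketched for linear regression. Note the crucial simplification: the sketch recomputes the LOO error by brute force (refitting n times per candidate), whereas the paper's contribution is an analytic recursive formula, built on the orthogonalization, that avoids exactly this cost. Data, regressor names, and the greedy deletion rule below are illustrative assumptions.

```python
def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting (fine for small systems).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(n):
            if r != c:
                f = M[r][c]
                M[r] = [a - f * b_ for a, b_ in zip(M[r], M[c])]
    return [M[i][n] for i in range(n)]

def fit(X, y):
    # Ordinary least squares via the normal equations X^T X w = X^T y.
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(A, b)

def loo_mse(X, y):
    # Brute-force leave-one-out mean squared error: refit n times.
    n = len(X)
    err = 0.0
    for i in range(n):
        w = fit(X[:i] + X[i + 1:], y[:i] + y[i + 1:])
        pred = sum(wj * xj for wj, xj in zip(w, X[i]))
        err += (y[i] - pred) ** 2
    return err / n

def backward_eliminate(X, y, names):
    # Greedily delete any regressor whose removal lowers the LOO MSE;
    # stop automatically when no deletion improves it.
    cols = list(range(len(names)))
    best = loo_mse(X, y)
    improved = True
    while improved and len(cols) > 1:
        improved = False
        for c in list(cols):
            trial = [k for k in cols if k != c]
            mse = loo_mse([[row[k] for k in trial] for row in X], y)
            if mse < best:
                best, cols, improved = mse, trial, True
                break
    return [names[c] for c in cols], best

# Toy data: y depends on x (plus fixed small noise); "junk" is irrelevant.
xs = [float(i) for i in range(8)]
noise = [0.05, -0.03, 0.04, -0.02, 0.01, -0.05, 0.03, 0.02]
junk = [0.3, -0.7, 0.2, 0.9, -0.4, 0.6, -0.1, 0.5]
X = [[1.0, x, j] for x, j in zip(xs, junk)]
y = [2.0 * x + 1.0 + e for x, e in zip(xs, noise)]
selected, sel_mse = backward_eliminate(X, y, ["const", "x", "junk"])
```

Because the stopping rule is "no deletion improves the LOO criterion", there is no tuning parameter and no separate validation set, which is exactly the property the abstract claims for the full algorithms.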
Nonspherical assemblies generated from polystyrene-b-poly(L-lysine) polyelectrolyte block copolymers
Abstract:
This report describes the aqueous-solution self-assembly of a series of polystyrene(m)-b-poly(L-lysine)(n) block copolymers (m = 8-10; n = 10-70). The polymers are prepared by ring-opening polymerization of ε-benzyloxycarbonyl-L-lysine N-carboxyanhydride using amine-terminated polystyrene macroinitiators, followed by removal of the benzyloxycarbonyl side-chain protecting groups. The critical micelle concentration of the block copolymers, determined using the pyrene probe technique, shows a parabolic dependence on peptide block length, exhibiting a maximum at n = approximately 20 (m = 8) or n = approximately 60 (m = 10). The shape and size of the aggregates have been studied by dynamic and static light scattering, small-angle neutron scattering (SANS), and analytical ultracentrifugation (AUC). Surprisingly, Holtzer and Kratky analysis of the static light scattering results indicates the presence of nonspherical, presumably cylindrical, objects independent of the poly(L-lysine)(n) block length. This is supported by the SANS data, which can be fitted well by assuming cylindrical scattering objects. AUC analysis allows the molecular weight of the aggregates to be estimated as several million g/mol, corresponding to aggregation numbers of several tens to hundreds. These aggregation numbers agree with those estimated from the length and diameter of the cylinders obtained from the scattering results.