907 results for Deliberate fire, repetition, crime analysis, intelligence-led policing, forensic intelligence
Abstract:
The use of information in planning police activity stands as one of the central questions in the debate over contemporary public-security paradigms, and crime statistics are an important instrument in this process. This article seeks to promote a reflection on the use of crime statistics by public-security professionals as a source of information for planning their actions. To that end, three variables considered central to the discussion are addressed: contemporary policing paradigms; the use and functions of statistics in planning police activity; and the use of crime statistics by public-security professionals in light of the organizational context in which they operate. The article concludes by pointing to the need to treat the culture and structure of police institutions as central elements in the development of a policing model marked by intelligence, proactivity and crime prevention.
Abstract:
Contemporary terrorism stands out as one of the most debated topics on the international political agenda. In the context of globalization, the activity of extremist groups is more complex and articulated than ever, and its dangerousness is exacerbated by the potential use of weapons of mass destruction. If it is true that a response can be pursued through cooperation among nations, it is equally true that the law can be a suitable instrument for ensuring better collaboration and more effective measures. Starting from the premise that terrorism can be understood as an institute of criminal law, this monograph investigates how Brazil, which notoriously lacks useful provisions in its ordinary legislation, could enact an anti-terrorism law. The research methodology is primarily descriptive, focused on understanding terrorism as a criminal-law institute. It offers a critical study of the theoretical and practical aspects of drafting an anti-terrorism law, including an examination of bills currently before the legislature, in order to see how it has approached the matter.
Abstract:
This undergraduate thesis centers on a critical analysis of Complementary Law No. 105 of 2001, which authorizes the Brazilian Federal Revenue Service to lift taxpayers' bank secrecy directly, on the basis of possible indications of omissions, fraud or simulation, as a means of curbing the crime of tax evasion. From this analysis, we test the hypothesis that no public official may order the disclosure of a taxpayer's banking information without prior authorization from the judiciary. The article has three parts. The first describes and discusses the main concepts surrounding bank secrecy and the possible exceptions to it. Building on that conceptual examination, we then study how the subject relates to the fight against tax evasion and to the affirmation of the principle of fiscal transparency in the international community. The final part examines the Supreme Court's position on the question. The conclusion reached is that public officials cannot obtain such information without a judge's prior authorization. Old as it is, however, the matter remains controversial in doctrine and case law. Moreover, the change in the composition of the Supreme Federal Court between 2010 and 2015 may signal a shift in the justices' understanding of the issue.
Abstract:
Latin America has recently experienced three cycles of capital inflows, the first two ending in major financial crises. The first took place between 1973 and the 1982 ‘debt-crisis’. The second took place between the 1989 ‘Brady bonds’ agreement (and the beginning of the economic reforms and financial liberalisation that followed) and the Argentinian 2001/2002 crisis, and ended up with four major crises (as well as the 1997 one in East Asia) — Mexico (1994), Brazil (1999), and two in Argentina (1995 and 2001/2). Finally, the third inflow-cycle began in 2003 as soon as international financial markets felt reassured by the surprisingly neo-liberal orientation of President Lula’s government; this cycle intensified in 2004 with the beginning of a (purely speculative) commodity price-boom, and actually strengthened after a brief interlude following the 2008 global financial crash — and at the time of writing (mid-2011) this cycle is still unfolding, although already showing considerable signs of distress. The main aim of this paper is to analyse the financial crises resulting from this second cycle (both in LA and in East Asia) from the perspective of Keynesian/ Minskyian/ Kindlebergian financial economics. I will attempt to show that no matter how diversely these newly financially liberalised Developing Countries tried to deal with the absorption problem created by the subsequent surges of inflow (and they did follow different routes), they invariably ended up in a major crisis. As a result (and despite the insistence of mainstream analysis), these financial crises took place mostly due to factors that were intrinsic (or inherent) to the workings of over-liquid and under-regulated financial markets — and as such, they were both fully deserved and fairly predictable. 
Furthermore, these crises point not just to major market failures, but to a systemic market failure: evidence suggests that these crises were the spontaneous outcome of actions by utility-maximising agents, freely operating in friendly (‘light-touch’) regulated, over-liquid financial markets. That is, these crises are clear examples that financial markets can be driven by buyers who take little notice of underlying values — i.e., by investors who have incentives to interpret information in a biased fashion in a systematic way. Thus, ‘fat tails’ also occurred because under these circumstances there is a high likelihood of self-made disastrous events. In other words, markets are not always right — indeed, in the case of financial markets they can be seriously wrong as a whole. Also, as the recent collapse of ‘MF Global’ indicates, the capacity of ‘utility-maximising’ agents operating in (excessively) ‘friendly-regulated’ and over-liquid financial market to learn from previous mistakes seems rather limited.
Abstract:
The right against self-incrimination is a fundamental right that operates within criminal prosecution and therefore deserves a study grounded in the general theory of criminal procedure. The right has an obscure origin and, despite the various historical accounts, arises only once there is a structured criminal procedure that aims to limit the State's duty-power to punish. The only system of criminal procedure yet experienced that is compatible with the bar on self-incrimination is the accusatory model. The inquisitorial model is based on the construction of a truth and on obtaining a confession at any cost, and is therefore incompatible with the right under study. The consecration of the right comes with the importance that fundamental rights have assumed in democratic constitutional states. In the Brazilian experience before 1988, it was only possible to recognize that self-incrimination represented a procedural burden for accused persons. Despite thorough debate in the Constituent Assembly, the right is enshrined in a textual formula closer to the implementation made by the Supreme Court of the United States, known as the "Miranda warnings", than to the text of the Fifth Amendment to the U.S. Constitution, which originally established the right against self-incrimination with constitutional status. The imprecise text does not, however, prevent the recognition of the principle as a fundamental right in Brazilian law. The right against self-incrimination must be observed throughout criminal procedure and relates to several of its canons, such as the presumption of innocence, the accusatory model, the distribution of the burden of proof and, especially, the right of defense. Because it is a fundamental right, the prohibition of self-incrimination deserves a study appropriate to its constitutional nature.
To define the protected persons, it is important to build a material concept of the accused, distinct from the formal concept of whoever is named in the indictment. Within the objective scope of protection, the norm protects two objects: the subject's instinct of self-preservation and the capacity for self-determination. Since the right is essentially a rule of evidence in criminal procedure, case analysis should rest on standards set in advance that indicate respect for it. These standards include the accused's right to information, the right to counsel and respect for voluntary participation. The study of cases of violation concentrates on the element of voluntariness, starting from the definition of what is, or is not, coercion that violates self-determination. The right faces new challenges that deserve attention, especially the fight against terrorism and organized crime, which drives the development of evidentiary tools, resources and technologies, with methods increasingly invasive and covert, and allows information to be used not only for criminal prosecution but also to build an intelligence strategy for national and public security.
Abstract:
This research investigates the knowledge related to ostensive policing held by a group of captains of the Rio Grande do Norte State Military Police. This knowledge, which is decisive and grounded in the constitutional mandate of the Brazilian Military Police, must be taken into account when planning and carrying out ostensive public-security services. A historical and social analysis of the formation of the police was therefore made, starting from foreign experiences and moving to the reality of Rio Grande do Norte. Examining the Brazilian and local scene, this knowledge was then analyzed against the principles of the Brazilian National Public Security Plan, the Brazilian Classification of Occupations (CBO 2002), the reference documents and studies for police training (the Curricular Basis and Matrix), the Variables of Ostensive Policing, and some important competences of police service. Arguing that this knowledge relates to what is presented here as the "Orientation Axis of Military Police Service", the research applied tools such as the "Critical Case Solution" and a "Questionnaire on Fundamental Areas of Military Police Service", yielding six knowledge models related to ostensive policing within that group. This knowledge falls into three distinct categories within military police activity: one with reactive/repressive characteristics, which predominates; a second that is preventive; and a third revealing that military police activity is being diverted to actions and/or missions outside its proper scope.
Abstract:
This research examines the changes in educational administration arising from the agreements between the municipality of Mossoró/RN and the Ayrton Senna Institute (IAS) for the provision of education. The partnership policy is today a constitutive element of the reform of the Brazilian State, which has scaled back its action in social policies while strengthening its regulatory role, encouraging private participation in the planning, preparation and implementation of public policies and thus reshaping the political-social landscape. In this context, the "10 Note" Management Programme developed by the IAS belongs to the neoliberal logic of modernizing public school systems: it focuses on results and develops strategies for controlling and regulating school work in pursuit of efficiency, effectiveness and greater productivity. The programme addresses two dimensions, the management of learning and of teaching in a network, from a managerial perspective that seeks to overcome the "culture of failure" (expressed in age-grade distortion, dropout and repetition rates) and to implant a "culture of success" (measured by improvement in those indices). To understand this process, we took as our object of study the implementation of the programme in the city, with the aim of analysing its implications for the school community from the perspective of democratic management, adopting autonomy and participation in institutional processes as the criteria of analysis. Methodologically, the research was conducted through a literature and documentary review of the educational policy developed in Brazil since the 1990s, seeking to understand, in a dialectical perspective, the political dimensions of teaching and of the training and performance of the subjects involved in school work.
Besides empirical observation, semi-structured interviews were used as a methodological tool for gathering information and opinions about the partnership and the implementation of the 10 Note Management Programme in the municipality. The interviewees were former education managers, coordinators, school managers, secretaries and teachers. Regarding the dimensions under analysis (autonomy and participation), the research led to the following conclusions: the municipal education department (GEED), under the guidance of the IAS, regulated school autonomy and set up both a selection process for the office of school administrator and a system of awards to schools, pupils and teachers conditioned on results; there is a mismatch between the managerial logic and the principles of democratic management; the ideological discourse of modernizing municipal management coexists with traditional, centralizing, patronage-based practices that ignore democratic participation in school decision-making; and the goals of the partnership were only partially achieved. Although the city improved its approval and dropout rates, and despite the approval of the Municipal Education Plan and of institutional rules (administrative, financial and educational) and the creation of the councils, the participation of the school community remains limited and cannot be characterized as a coordinated intervention capable of transforming education and improving its quality in the municipality. Likewise, the networked orientation limits the autonomy of schools, given the external definition of the goals and strategies to be adopted, together with the pressure exerted by holding each school community accountable for its results.
Abstract:
The aim of this study was to evaluate the effectiveness of photodynamic therapy for the decontamination of artificially induced carious bovine dentin, using Photogem(R) as the photosensitizing agent and an LED device as the light source. Dentin samples obtained from bovine incisors were immersed in sterile broth inoculated with Lactobacillus acidophilus (10^8 colony-forming units, CFU) and Streptococcus mutans (10^8 CFU). Different photosensitizer concentrations (PA = 1 mg/ml, PB = 2 mg/ml and PC = 3 mg/ml) and two fluences (D = 24 J/cm^2 and D = 48 J/cm^2) were investigated. After counting CFU per milligram of carious dentin and performing statistical analysis, we observed that the photodynamic therapy (PDT) parameters used were effective for bacterial reduction in the in vitro model under study. The best result was achieved with Photogem(R) at 2 mg/ml photoactivated at 24 J/cm^2, yielding a survival factor of 0.14. At higher photosensitizer concentrations, higher dark toxicity was observed. We propose a simple mathematical expression for determining the PDT parameters of photosensitizer concentration and light fluence for different survival factor values. Since LED devices are simpler and cheaper than laser systems, it would be interesting to verify their efficacy as a light source in photodynamic therapy for the decontamination of carious dentin.
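The survival factor reported in the abstract is the ratio of viable colony counts after treatment to those of an untreated control. A minimal sketch of that calculation follows; the log-linear dose model is purely illustrative (it is not the expression proposed in the paper), and the rate constant `k` is a hypothetical value chosen to reproduce the single reported data point (SF = 0.14 at 2 mg/ml and 24 J/cm^2):

```python
import math

def survival_factor(cfu_treated, cfu_control):
    # Fraction of viable bacteria remaining after PDT.
    return cfu_treated / cfu_control

def predicted_sf(concentration_mg_ml, fluence_j_cm2, k=0.041):
    # Hypothetical log-linear dose model: ln(SF) = -k * C * D.
    # k = 0.041 matches the reported SF of 0.14 at C = 2 mg/ml, D = 24 J/cm^2;
    # a real k would have to be fitted to the full experimental data set.
    return math.exp(-k * concentration_mg_ml * fluence_j_cm2)
```

Under such a model, a target survival factor fixes the product C·D, so concentration and fluence can be traded off against each other, subject to the dark-toxicity limit noted above.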
Abstract:
Swallowing dynamics involves the coordination and interaction of several muscles and nerves, which allow correct food transport from mouth to stomach without laryngotracheal penetration or aspiration. Clinical swallowing assessment depends on the evaluator's knowledge of the anatomic structures and neurophysiological processes involved in swallowing. Any alteration in those steps is termed oropharyngeal dysphagia, which may have many causes, such as neurological or mechanical disorders. Videofluoroscopy of swallowing is presently considered the best exam for objectively assessing the dynamics of swallowing, but it must be conducted under certain restrictions owing to the patient's exposure to radiation, which limits periodic repetition for monitoring swallowing therapy. Another method, cervical auscultation, is a promising new diagnostic tool for the assessment of swallowing disorders. The potential to diagnose dysphagia noninvasively by assessing the sounds of swallowing is a highly attractive option for the dysphagia clinician. Even so, the captured sound contains a certain amount of noise, which can hamper the evaluator's decision. The present paper therefore proposes the use of a filter to improve the quality of the audible sound and facilitate the interpretation of the examination. A wavelet denoising approach is used to decompose the noisy signal, and the signal-to-noise ratio was evaluated to demonstrate the quantitative results of the proposed methodology. (C) 2007 Elsevier Ltd. All rights reserved.
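The wavelet denoising step described above can be sketched with a multi-level Haar transform and soft thresholding. This is a minimal pure-NumPy illustration: the paper does not specify the wavelet family or threshold rule, so the Haar basis and the VisuShrink universal threshold used here are assumptions.

```python
import numpy as np

def haar_dwt(x):
    # One level of the Haar transform: pairwise averages and differences.
    s = np.sqrt(2.0)
    return (x[0::2] + x[1::2]) / s, (x[0::2] - x[1::2]) / s

def haar_idwt(a, d):
    # Inverse of one Haar level.
    s = np.sqrt(2.0)
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / s
    x[1::2] = (a - d) / s
    return x

def denoise(signal, levels=4):
    # Assumes len(signal) is divisible by 2**levels.
    a, details = np.asarray(signal, dtype=float), []
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    # VisuShrink universal threshold, with the noise level estimated
    # from the median absolute value of the finest detail band.
    sigma = np.median(np.abs(details[0])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    # Soft-threshold the detail bands, keep the approximation intact.
    details = [np.sign(d) * np.maximum(np.abs(d) - thr, 0.0) for d in details]
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a

def snr_db(reference, signal):
    # Signal-to-noise ratio in decibels relative to a clean reference.
    noise = signal - reference
    return 10.0 * np.log10(np.sum(reference ** 2) / np.sum(noise ** 2))
```

On a smooth test signal corrupted with white noise, the thresholding removes most of the small detail coefficients carrying noise while preserving the coarse approximation, which raises the measured SNR.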
Abstract:
This study aims to demonstrate that data from business games can be an important resource for improving the efficiency and effectiveness of learning. The proposal presented here was developed from preliminary studies of data from Virtual Market games, which pointed to the possibility of identifying gaps in learning by analyzing students' decisions. The proposal helps students refine their learning processes and equips tutors with strategies for teaching and student assessment. It also complements group discussion and/or debriefing, which are widely used to enhance learning mediated by games but which, from a management perspective, can be erroneous and miss opportunities, since they depend on individual characteristics such as the ability to communicate and work in a team. To illustrate the proposed technique, data sets from two business games were analyzed with a focus on working capital management, and it was found that students had difficulty with this task. Similar trends were observed in all categories of students in the study: undergraduate, postgraduate and specialization. This finding led us to analyze the decisions made during the games, and it was determined that indicators could be developed that are capable of identifying inconsistencies in those decisions. Some basic concepts of financial management were applied, such as the management of operational and non-operational expenditures, as well as production management concepts such as the use of production capacity. By analyzing the data from the Virtual Market games with these indicators, it was possible to detect students' lack of domain knowledge. The indicators can therefore be used to analyze the players' decisions and guide them during the game, increasing their effectiveness and efficiency.
As the indicators were developed from specific content, they can also be used to develop teaching materials to support learning. Viewed in this light, the proposal adds new possibilities for using business games in learning: in addition to the intrinsic learning achieved through playing the games, they also help drive the learning process. This study discusses the applications and the methodology used.
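An indicator of the kind described, flagging inconsistent working-capital decisions from game data, might be sketched as follows. The field names and the minimum current-ratio threshold are assumptions for illustration; the actual indicators in the study are derived from the Virtual Market game's own variables.

```python
def working_capital(current_assets, current_liabilities):
    # Net working capital: resources available to fund day-to-day operations.
    return current_assets - current_liabilities

def flag_decisions(rounds, min_ratio=1.0):
    # Flag game rounds whose current ratio (assets / liabilities) falls
    # below a minimum, signalling a possible gap in the student's grasp
    # of working-capital management.
    flagged = []
    for r in rounds:
        ratio = r["current_assets"] / r["current_liabilities"]
        if ratio < min_ratio:
            flagged.append(r["round"])
    return flagged
```

A tutor could run such a check after every round and use the flagged rounds to steer debriefing toward the specific concept the student is mishandling.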
Abstract:
Artificial neural networks are dynamic systems consisting of highly interconnected and parallel nonlinear processing elements. Systems based on artificial neural networks have high computational rates due to the use of a massive number of these computational elements. Neural networks with feedback connections provide a computing model capable of solving a rich class of optimization problems. In this paper, a modified Hopfield network is developed for solving problems related to operations research. The internal parameters of the network are obtained using the valid-subspace technique. Simulated examples are presented as an illustration of the proposed approach. Copyright (C) 2000 IFAC.
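The paper's modified Hopfield network, with internal parameters obtained by the valid-subspace technique, is specific to its operations-research formulation. As a baseline illustration of the feedback dynamics involved, a classic discrete Hopfield network, which settles into a stable state by iterated updates of its interconnected units, can be sketched as:

```python
import numpy as np

def hebbian_weights(patterns):
    # Hebbian outer-product rule for bipolar (+1/-1) patterns;
    # the diagonal is zeroed so units have no self-connections.
    p = np.asarray(patterns, dtype=float)
    w = p.T @ p / p.shape[1]
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, steps=100):
    # Iterate synchronous sign updates until the state stops changing,
    # i.e. the network has settled into a minimum of its energy function.
    s = np.asarray(state, dtype=float).copy()
    for _ in range(steps):
        new = np.sign(w @ s)
        new[new == 0] = 1.0
        if np.array_equal(new, s):
            break
        s = new
    return s
```

Here the feedback connections drive a corrupted input back toward a stored pattern; in the optimization setting of the paper, the same energy-minimizing dynamics are instead steered toward feasible solutions of the problem being solved.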
Abstract:
The advantages offered by the LED (Light Emitting Diode) have led to the rapid and widespread adoption of this device as a replacement for incandescent lamps. In combined applications, however, the relationship between the design variables and the desired effect or result is very complex, and it becomes difficult to model by conventional techniques. This work develops a technique, through comparative analysis of neuro-fuzzy architectures, for obtaining the luminous intensity values of LED brake lights from design data.
Abstract:
This paper proposes the application of computational intelligence techniques to assist with complex problems concerning lightning in transformers. A neural tool is presented for estimating the currents caused by lightning strikes on a transformer; the training vectors were generated with ATP. The input variables used in the Artificial Neural Network (ANN) were the wave front time, the wave tail time and the voltage variation rate, and the output variable was the maximum current in the secondary of the transformer. These parameters can define the behavior and severity of lightning. Based on these concepts and on the results obtained, it can be verified that the overvoltages at the secondary of the transformer are affected by the discharge waveform in a way similar to the primary side. With the developed tool, the high-voltage process in distribution transformers can be mapped and estimated with more precision, aiding the transformer design process, reducing empiricism and evaluation errors, and contributing to a lower transformer failure rate. (C) 2011 Elsevier Ltd. All rights reserved.
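The mapping described, from three waveform parameters to a maximum secondary current, can be sketched as follows. Everything here is a stand-in: the data are synthetic (the paper's vectors come from ATP simulations), the target relation is invented for illustration, and a random-hidden-layer network with a least-squares readout replaces the paper's trained ANN purely to keep the sketch short and self-contained.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical training set: columns are wave front time, wave tail time
# and voltage variation rate; the target is a synthetic stand-in for the
# maximum secondary current (NOT the paper's data or model).
X = rng.uniform([1.0, 20.0, 100.0], [10.0, 350.0, 2000.0], size=(200, 3))
y = 0.002 * X[:, 2] / X[:, 0]

# Normalize inputs, project through one random tanh hidden layer, and fit
# a linear readout (plus bias) by least squares -- an extreme-learning-
# machine-style shortcut used here instead of full backpropagation.
Xn = (X - X.mean(axis=0)) / X.std(axis=0)
W_hidden = rng.standard_normal((3, 16))
H = np.hstack([np.tanh(Xn @ W_hidden), np.ones((X.shape[0], 1))])
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = H @ beta
```

The same three-input, one-output structure is what lets the tool interpolate secondary currents for discharge waveforms that were never simulated explicitly.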
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)