1000 results for "Teoria Geral de Sistemas"
Abstract:
This work aims to situate the Programa Grande Carajás within the development plans for the Amazon, of which it is the program for the Eastern Amazon. Its theoretical basis is Guerreiro Ramos's framework for the delimitation of social systems.
Abstract:
The rapid advance of Internet-based information systems has driven a number of transformations in healthcare delivery within hospitals. This development, however, has raised concerns among academics and managers about these institutions' actual capacity to manage its introduction satisfactorily. Many healthcare institutions have consumed large sums of investment and frustrated countless people in wasted efforts to implement clinical systems. Unfortunately, there are no easy answers as to why so many projects fail. This case study empirically investigates the main causes of the unsuccessful implementation of a clinical Electronic Prescription system at the Hospital Adventista Silvestre. The theoretical framework adopted is the Theory of Resistance to Systems Implementation, notably set out by Kling (1980) and Markus (1983), who argue that a comprehensive analysis of the intra-organizational context is needed to reveal the main factors behind failed implementations. The interpretive analysis developed from the three resistance vectors of this theory adds new perspectives and insights to the existing body of knowledge in the field of Information Technology implementation and Medical Informatics.
Abstract:
This item does not include the full text; the complete book can be purchased from Editora da UFSCar via the link: www.editora.ufscar.br
Abstract:
Since its first republican constitution, Brazil has adopted a system of judicial review of laws. Review authority was assigned to the three state powers, or state functions: the Executive, the Legislative, and the Judiciary. Throughout the country's constitutional history, however, jurisdictional review has stood out as the most important form of control. Initially, in 1891, Brazil adopted the diffuse model of judicial review from the United States, in which the task of reviewing legal norms is entrusted to every organ of the Judiciary: when deciding a case put on trial, any court may determine whether or not a law can be applied, setting aside its effects in case of unconstitutionality. In 1969, a second model of judicial review entered the legal order: concentrated constitutional review, inspired by the positivist theory of Hans Kelsen and adopted by the Austrian Constitution of 1920. Under abstract review, the supervision of laws is assigned to a single Court or Constitutional Court, responsible for analyzing a law's constitutionality independently of its application to a specific case. In Brazil, concentrated review was entrusted exclusively to the Supreme Court, which serves as the Constitutional Court while accumulating that function with other constitutionally assigned jurisdiction. From 1891 to the present, Brazil has thus maintained a dual system of judicial review of constitutionality, in which diffuse review, exercised by any organ of the Judiciary, coexists and harmonizes with concentrated review within the competence of the Supreme Court. It must be recognized, however, that with the advent of the Federal Constitution of 1988, concentrated review gained prominence on the national stage due to two important factors: the expansion of standing to sue and the inclusion of other forms of review alongside the already familiar Direct Claim of Unconstitutionality.
This projection of concentrated review, and the Supreme Court's attempt to become a true constitutional court, led to a relative weakening of diffuse review even when exercised by the Brazilian Constitutional Court itself. For the Supreme Court to become a true constitutional court, all decisions it hands down in the exercise of constitutional jurisdiction should carry the same weight and the same measure, or at least produce similar effects, since it is responsible for the final word in constitutional interpretation. Thus, the writ of certiorari and stare decisis were responsible for profound changes in diffuse review, allowing the Supreme Court to strengthen its decisions even in that setting. These two institutions substantially changed the legal status of diffuse review by extending the effects of the decisions handed down by the Supreme Court, so that it can no longer be said that the effects of this form of review are restricted to the parties to the case.
Abstract:
Currently, one of the biggest challenges in data mining is performing cluster analysis on complex data. Several techniques have been proposed, but in general they achieve good results only within specific domains, and there is no consensus on the best way to group this kind of data. These techniques generally fail because of unrealistic assumptions about the true probability distribution of the data. Motivated by this, this thesis proposes a new measure, based on the Cross Information Potential, that uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach retains all the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its own nature. From it, two cost functions and three algorithms are proposed to perform cluster analysis. Because the use of Information Theory captures the relationship between different patterns regardless of assumptions about the nature of that relationship, the proposed approach achieved better performance than the main algorithms in the literature. These results hold both for synthetic data designed to test the algorithms in specific situations and for real data drawn from problems in different fields.
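As a point of reference, the Cross Information Potential from information-theoretic learning is commonly estimated as the mean pairwise Gaussian kernel between two sample sets. The sketch below shows that generic textbook estimator, not the thesis's representative-point variant; the kernel width sigma and the toy points are illustrative assumptions.

```python
import math

def cross_information_potential(X, Y, sigma=1.0):
    """Estimate the Cross Information Potential between point sets X and Y
    as the mean pairwise Gaussian kernel (generic estimator, with the
    usual sqrt(2)*sigma width from convolving two equal-width kernels)."""
    width = math.sqrt(2.0) * sigma
    total = 0.0
    for x in X:
        for y in Y:
            d2 = sum((a - b) ** 2 for a, b in zip(x, y))  # squared distance
            total += math.exp(-d2 / (2.0 * width ** 2))
    return total / (len(X) * len(Y))

# Toy illustration: points in the same region interact strongly,
# well-separated groups interact weakly.
near = [(0.0, 0.0), (0.1, 0.1)]
far = [(5.0, 5.0), (5.1, 5.1)]
print(cross_information_potential(near, near))  # close to 1
print(cross_information_potential(near, far))   # close to 0
```

A small inter-group value relative to the within-group value is precisely the kind of "interaction between groups" signal that a CIP-based cost function can optimize.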
Abstract:
This work interprets and analyzes the problem of induction from a viewpoint founded on set theory and probability theory, taken as a basis for resolving its negative philosophical implications for systems of inductive logic in general. Given the importance of the problem and the relatively recent development of these fields of knowledge (early 20th century), as well as the visible relations between them and the process of inductive inference, a relatively unexplored and promising field of possibilities has opened up. The key point of the study is to model the information-acquisition process using concepts from set theory, followed by a treatment using probability theory. Throughout the study, two major obstacles to a probabilistic justification were identified: the problem of defining the concept of probability and that of defining rationality, along with the subtle connection between the two. This finding called for greater care in choosing the criterion of rationality to be considered, in order to ease the treatment of the problem through specific situations without losing their original characteristics, so that the conclusions can be extended to classic cases such as the question of whether the sun will continue to rise.
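For concreteness, one classical probabilistic treatment of the sunrise question is Laplace's rule of succession, offered here purely as an illustration of the probability-theoretic approach, not necessarily the treatment the thesis develops. With a uniform prior on the unknown probability $p$ of a sunrise and $n$ sunrises observed so far:

```latex
P(S_{n+1} \mid S_1, \dots, S_n)
  = \frac{\int_0^1 p^{\,n+1} \, dp}{\int_0^1 p^{\,n} \, dp}
  = \frac{n+2}{n+1} \Big/ \frac{n+1}{n} % (incorrect intermediate removed)
```

evaluating the Beta integrals gives

```latex
P(S_{n+1} \mid S_1, \dots, S_n)
  = \frac{1/(n+2)}{1/(n+1)}
  = \frac{n+1}{n+2},
```

which tends to 1 as evidence accumulates but never reaches certainty, capturing exactly the tension between inductive confidence and logical justification that the problem of induction raises.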
Abstract:
In this work we investigate the stochastic behavior of a large class of variable-damping systems described by a time-dependent Lagrangian. Our stochastic approach is based on the Langevin treatment of the motion of a classical Brownian particle of mass m. Two situations of physical interest are considered. In the first, we discuss in detail an application of the standard Langevin treatment (white noise) to the variable-damping system. In the second, a more general viewpoint is adopted by assuming a given expression for the so-called colored noise. In both cases, the basic differential equations are solved analytically and all physically relevant quantities are determined explicitly. The results depend on an arbitrary parameter q measuring how far the behavior of the system departs from that of a standard Brownian particle with constant viscosity. Several types of stochastic behavior (superdiffusive and subdiffusive) are obtained as the free parameter varies continuously. However, all the results of the conventional Langevin approach with constant damping are recovered in the limit q = 1.
Abstract:
The next generation of computers is expected to rely on architectures with multiple processors and/or multicore processors. This raises challenges related to interconnection resources, operating frequency, chip area, power dissipation, performance, and programmability. The interconnection and communication mechanism considered ideal for this type of architecture is the network-on-chip, owing to its scalability, reusability, and intrinsic parallelism. In a network-on-chip, communication is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packets are transmitted as in a pipeline between the routers of the network, from the source to the destination of the communication, even allowing simultaneous communications between different source-destination pairs. Building on this, we propose to transform the entire communication infrastructure of the network-on-chip, using its routing, arbitration, and storage mechanisms, into a high-performance parallel processing system. In this proposal, packets are formed by the instructions and data that represent the applications, and the instructions are executed in the routers as the packets are transmitted, exploiting the pipeline and the parallelism of the communication. Traditional processors are not used; only simple cores that control access to memory. An implementation of this idea, called IPNoSys (Integrated Processing NoC System), has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets while preventing deadlock, livelock, and starvation. The architecture provides mechanisms for input and output, interrupts, and operating system support.
As a proof of concept, a programming environment and a simulator for this architecture were developed in SystemC, allowing various parameters to be configured and several results to be obtained for its evaluation.