Abstract:
Although the principle of equal access to medically justified treatment has been promoted by official health policies in many Western health care systems, practices do not completely meet policy targets. Waiting times for elective surgery vary between patient groups and regions, and growing problems in the availability of services threaten equal access to treatment. Waiting times have come to the attention of decision-makers, and several policy initiatives have been introduced to ensure the availability of care within a reasonable time. In Finland, for example, the treatment guarantee came into force in 2005. However, no consensus exists on the optimal waiting time for different patient groups. The purpose of this multi-centre randomized controlled trial was to analyse health-related quality of life, pain and physical function in total hip or knee replacement patients during the waiting time and to evaluate whether the waiting time is associated with patients' health outcomes at admission. This study also assessed whether the length of waiting time is associated with social and health services utilization in patients awaiting total hip or knee replacement. In addition, patients' health-related quality of life was compared with that of the general population. Consecutive patients with a need for a primary total hip or knee replacement due to osteoarthritis were placed on the waiting list between August 2002 and November 2003. Patients were randomly assigned to a short waiting time (maximum 3 months) or a non-fixed waiting time (waiting time not fixed in advance; instead, the patient followed the hospital's routine practice). Patients' health-related quality of life was measured upon being placed on the waiting list and again at hospital admission using the generic 15D instrument. Pain and physical function were evaluated using the self-report Harris Hip Score for hip patients and a scale modified from the Knee Society Clinical Rating System for knee patients. Utilization measures were the use of home health care, rehabilitation and social services, physician visits and inpatient care. Health and social services use was low in both waiting time groups. The most common services used while waiting were rehabilitation services and informal care, including unpaid care provided by relatives, neighbours and volunteers. Although patients suffered from clear restrictions in usual activities and physical functioning, they seemed primarily to lean on informal care and personal networks instead of professional care. While a longer waiting time did not result in poorer health-related quality of life at admission, and use of services during the waiting time was similar to that at the time of placement on the list, the costs of waiting are likely to be higher for people who wait longer simply because they use services for a longer period. In economic terms, this would represent a negative impact of waiting. Only a few reports have been published on the health-related quality of life of patients awaiting total hip or knee replacement. These findings demonstrate that, in addition to the physical dimensions of health, patients suffered from restrictions in psychological well-being such as depression, distress and reduced vitality. This raises the question of how to support patients who suffer from psychological distress during the waiting time and how to develop strategies to improve patients' own initiatives to reduce symptoms and the burden of waiting.
Key words: waiting time, total hip replacement, total knee replacement, health-related quality of life, randomized controlled trial, outcome assessment, social service, utilization of health services
Abstract:
Nephrin is a transmembrane protein belonging to the immunoglobulin superfamily and is expressed primarily in the podocytes, which are highly differentiated epithelial cells needed for primary urine formation in the kidney. Mutations leading to nephrin loss abrogate podocyte morphology and result in massive protein loss into urine and consequent early death in humans carrying specific mutations in this gene. The disease phenotype is closely replicated in respective mouse models. The purpose of this thesis was to generate novel inducible mouse lines, which allow targeted gene deletion in a time- and tissue-specific manner. A proof-of-principle model for successful gene therapy for this disease was generated, which allowed podocyte-specific transgene replacement to rescue gene-deficient mice from perinatal lethality. Furthermore, the phenotypic consequences of nephrin restoration in the kidney and of nephrin deficiency in the testis, brain and pancreas of rescued mice were investigated. A novel podocyte-specific construct was produced using standard cloning techniques to provide an inducible tool for in vitro and in vivo gene targeting. Using modified constructs and microinjection procedures, two novel transgenic mouse lines were generated. First, a mouse line with doxycycline-inducible expression of Cre recombinase, which allows podocyte-specific gene deletion, was generated. Second, a mouse line with doxycycline-inducible expression of rat nephrin, which allows podocyte-specific nephrin over-expression, was made. Furthermore, it was possible to rescue nephrin-deficient mice from perinatal lethality by cross-breeding them with a mouse line with inducible rat nephrin expression, which restored the missing endogenous nephrin only in the kidney after doxycycline treatment. The rescued mice were smaller, were infertile, showed genital malformations and developed distinct histological abnormalities in the kidney with an altered molecular composition of the podocytes. Histological changes were also found in the testis, cerebellum and pancreas. The expression of another molecule with limited tissue expression, densin, was localized to the plasma membranes of Sertoli cells in the testis by immunofluorescence staining. Densin may be an essential adherens junction protein between Sertoli cells and developing germ cells, and these junctions share a similar protein assembly with kidney podocytes. This single, binary conditional construct serves as a cost- and time-efficient tool to increase the understanding of podocyte-specific key proteins in health and disease. The results verified tightly controlled inducible podocyte-specific transgene expression in vitro and in vivo, as expected. These novel mouse lines with doxycycline-inducible Cre recombinase and with rat nephrin expression will be useful for conditional gene targeting of essential podocyte proteins and for studying their functions in detail in adult mice. This is important for future diagnostic and pharmacologic development platforms.
Abstract:
Rhizoremediation is the use of microbial populations present in the rhizosphere of plants for environmental cleanup. The idea of this work was that bacteria living in the rhizosphere of a nitrogen-fixing leguminous plant, goat's rue (Galega orientalis), could take part in the degradation of harmful monoaromatic hydrocarbons, such as benzene, toluene and xylene (BTEX), from oil-contaminated soils. In addition to chemical (e.g. pollutant concentration) and physical (e.g. soil structure) information, knowledge of biological aspects (e.g. bacteria and their catabolic genes) is essential when developing rhizoremediation into a controlled and effective bioremediation practice. Therefore, the need for reliable biomonitoring methods is obvious. The main aims of this thesis were to evaluate the symbiotic G. orientalis - Rhizobium galegae system for rhizoremediation of oil-contaminated soils, to develop molecular methods for biomonitoring, and to apply these methods for studying the microbiology of rhizoremediation. In vitro, Galega plants and rhizobia remained viable in m-toluate concentrations up to 3000 mg/l. Plant growth and nodulation were inhibited in 500 mg/l m-toluate, but were restored when plants were transferred to clean medium. In the greenhouse, Galega showed good growth, nodulation and nitrogen fixation, and developed a strong rhizosphere in soils contaminated with oil or spiked with 2000 mg/l m-toluate. The high aromatic tolerance of R. galegae and the viability of Galega plants in oil-polluted soils proved this legume system to be a promising method for the rhizoremediation of oil-contaminated soils. Molecular biomonitoring methods were designed and/or developed further for bacteria and their degradation genes. A combination of genomic fingerprinting ((GTG)5-PCR), taxonomic ribotyping of 16S rRNA genes and partial 16S rRNA gene sequencing was chosen for molecular grouping of culturable, heterogeneous rhizosphere bacteria. PCR primers specific for the xylE gene were designed for TOL plasmid detection. Amplified enzyme-coding DNA restriction analysis (AEDRA) with AluI was used to profile both TOL plasmids (xylE primers) and, more generally, aromatics-degrading plasmids (C23O primers). The sensitivity of the direct monitoring of TOL plasmids in soil was enhanced by nested C23O-xylE-PCR. Rhizosphere bacteria were isolated from the greenhouse and field lysimeter experiments. High genetic diversity was observed among the 50 isolated, m-toluate-tolerating rhizosphere bacteria, which represented five major lineages of the Bacteria domain. Gram-positive Rhodococcus, Bacillus and Arthrobacter and gram-negative Pseudomonas were the most abundant genera. The inoculum Pseudomonas putida PaW85/pWW0 was not found in the rhizosphere samples. Even if there were no ecological niches available for the bioaugmentation bacterium itself, its conjugative catabolic plasmid might have had some additional value for other bacterial species and thus for rhizoremediation. Only 10 to 20% of the isolated, m-toluate-tolerating bacterial strains were also able to degrade m-toluate. TOL plasmids were a major group of catabolic plasmids among these bacteria. The ability to degrade m-toluate using enzymes encoded by a TOL plasmid was detected only in species of the genus Pseudomonas, and the best m-toluate degraders were these Pseudomonas species. Strain-specific differences in degradation abilities were found for P. oryzihabitans and P. migulae: some of these strains harbored a TOL plasmid - a new finding observed in this work, indicating putative horizontal plasmid transfer in the rhizosphere. One P. oryzihabitans strain harbored the pWW0 plasmid, which had probably conjugated from the bioaugmentation Pseudomonas. Some P. migulae and P. oryzihabitans strains seemed to harbor both the pWW0- and the pDK1-type TOL plasmid. Alternatively, they might have harbored a TOL plasmid with both the pWW0- and the pDK1-type xylE gene. The breakdown of m-toluate by gram-negative bacteria was not restricted to the TOL pathway. Some gram-positive Rhodococcus erythropolis and Arthrobacter aurescens strains were also able to degrade m-toluate in the absence of a TOL plasmid. Three aspects of the rhizosphere effect of G. orientalis were manifested in oil-contaminated soil in the field: 1) G. orientalis and Pseudomonas bioaugmentation increased the number of rhizosphere bacteria. G. orientalis, especially together with Pseudomonas bioaugmentation, increased the numbers of m-toluate-utilizing and catechol-positive bacteria, indicating an increase in degradation potential. 2) Bacterial diversity, measured as the number of ribotypes, also increased in the Galega rhizosphere with or without Pseudomonas bioaugmentation. However, the diversity of m-toluate-utilizing bacteria did not increase significantly. At the community level, using the 16S rRNA gene PCR-DGGE method, the highest diversity of species was likewise observed in vegetated soils compared with non-vegetated soils. Diversified communities may best guarantee the overall success of rhizoremediation by offering various genetic machineries for catabolic processes. 3) At the end of the experiment, no TOL plasmid could be detected by direct DNA analysis in soil treated with both G. orientalis and Pseudomonas. The detection limit for TOL plasmids was reached, indicating a decreased amount of degradation plasmids and thus the success of rhizoremediation. The use of G. orientalis for rhizoremediation is unique. In this thesis, new information was obtained about the rhizosphere effect of Galega orientalis in BTEX-contaminated soils. The molecular biomonitoring methods can be applied for several purposes within environmental biotechnology, such as evaluating the intrinsic biodegradation potential, monitoring enhanced bioremediation, and estimating the success of bioremediation. Environmental protection using nature's own resources, and thus acting according to the principle of sustainable development, would be both economically and environmentally beneficial for society. Keywords: molecular biomonitoring, genetic fingerprinting, soil bacteria, bacterial diversity, TOL plasmid, catabolic genes, horizontal gene transfer, rhizoremediation, rhizosphere effect, Galega orientalis, aerobic biodegradation, petroleum hydrocarbons, BTEX
Abstract:
Various proposals exist for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic proposal, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault-tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems that are described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and to carry quantum numbers, both of which are of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique; rather, the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the presented general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest one of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other more complicated fusion subalgebras were considered. Studying their applicability for quantum computation could be a topic of further research.
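As a minimal illustration of what a non-trivial fusion algebra means (a general statement from anyon theory, not a result specific to this thesis): the fusion of two anyon types a and b decomposes into a sum of possible outcomes,

    a \times b = \sum_{c} N_{ab}^{c} \, c ,

where the non-negative integers N_{ab}^{c} count the distinct fusion channels. Whenever more than one outcome c is possible, the joint state of a and b is not fixed by the particle types alone; this fusion-channel degeneracy provides the decoherence-protected state space in which quantum information can be encoded and then manipulated by braiding.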
Abstract:
In technicolor theories the scalar sector of the Standard Model is replaced by a strongly interacting sector. Although the Standard Model has been exceptionally successful, the scalar sector causes theoretical problems that make these theories an attractive alternative. I begin my thesis by considering QCD, which is the known example of strong interactions. The theory exhibits two phenomena: confinement and chiral symmetry breaking. I find the low-energy dynamics to be similar to that of the sigma models. Then I analyze the problems of the Standard Model Higgs sector, mainly unnaturalness and triviality. Motivated by the example of QCD, I introduce the minimal technicolor model to resolve these problems. I demonstrate that the minimal model is free of anomalies and then deduce the main elements of its low-energy particle spectrum. I find that the particle spectrum contains massless or very light technipions, as well as technibaryons and techni-vector mesons with high masses of over 1 TeV. Standard Model fermions remain strictly massless at this stage. Thus I introduce the technicolor companion theory of flavor, called extended technicolor. I show that the Standard Model fermions and technihadrons receive masses, but that they remain too light. I also discuss flavor-changing neutral currents and precision electroweak measurements. I then show that walking technicolor models partly solve these problems. In these models, contrary to QCD, the coupling evolves slowly over a large energy scale. This behavior adds to the masses, so that even the light technihadrons are too heavy to be detected at current particle accelerators. All observed masses of the Standard Model particles can also be generated, except for those of the bottom and top quarks. Thus it is shown in this thesis that, excluding the masses of the third-generation quarks, theories based on walking technicolor can in principle produce the observed particle spectrum.
Abstract:
The Standard Model of particle physics consists of quantum electrodynamics (QED) and the weak and strong nuclear interactions. QED is the basis for molecular properties, and thus it defines much of the world we see. The weak nuclear interaction is responsible for decays of nuclei, among other things, and in principle it should also have effects at the molecular scale. The strong nuclear interaction is hidden in interactions inside nuclei. From high-energy and atomic experiments it is known that the weak interaction does not conserve parity. Consequently, the weak interaction, and specifically the exchange of the Z^0 boson between a nucleon and an electron, induces small energy shifts of different sign for mirror-image molecules. This in turn makes one enantiomer of a molecule energetically more favorable than the other and also shifts the spectral lines of the mirror-image pair of molecules in different directions, creating a split. Parity violation (PV) in molecules, however, has not been observed. The topic of this thesis is how the weak interaction affects certain molecular magnetic properties, namely certain parameters of nuclear magnetic resonance (NMR) and electron spin resonance (ESR) spectroscopies. The thesis consists of numerical estimates of NMR and ESR spectral parameters and investigations of how different aspects of quantum chemical computations affect them. PV contributions to the NMR shielding and spin-spin coupling constants are investigated from the computational point of view. All aspects of quantum chemical electronic structure computations are found to be very important, which makes accurate computations challenging. Effects of molecular geometry are also investigated using a model system of polysilyene chains. The PV contribution to the NMR shielding constant is found to saturate after the chain reaches a certain length, but the effects of local geometry can be large. Rigorous vibrational averaging is also performed for a relatively small and rigid molecule. Vibrational corrections to the PV contribution are found to be only a couple of per cent. PV contributions to the ESR g-tensor are also evaluated using a series of molecules. Unfortunately, all the estimates are below the experimental limits, but PV in some of the heavier molecules comes close to present-day experimental resolution.
Abstract:
Atomic layer deposition (ALD) is a method for thin film deposition which has been extensively studied for binary oxide thin film growth. Studies on multicomponent oxide growth by ALD remain relatively few owing to the increased number of factors that come into play when more than one metal is employed. More metal precursors are required, and the surface may change significantly during successive stages of the growth. Multicomponent oxide thin films can be prepared in a well-controlled way as long as the same principle that makes binary oxide ALD work so well is followed for each constituent element: in short, the film growth has to be self-limiting. ALD of various multicomponent oxides was studied. SrTiO3, BaTiO3, Ba(1-x)SrxTiO3 (BST), SrTa2O6, Bi4Ti3O12, BiTaO4 and SrBi2Ta2O9 (SBT) thin films were prepared, many of them for the first time by ALD. The chemistries of the binary oxides are shown to influence the processing of their multicomponent counterparts. The compatibility of precursor volatilities, thermal stabilities and reactivities is essential for multicomponent oxide ALD, but it should be noted that the main reactive species, the growing film itself, must also be compatible with self-limiting growth chemistry. In the cases of BaO and Bi2O3 the growth of the binary oxide was very difficult, but the presence of Ti or Ta in the growing film made self-limiting growth possible. The application of the deposited films as dielectric and ferroelectric materials was studied. Post-deposition annealing treatments in different atmospheres were used to achieve the desired crystalline phase or, more generally, to improve electrical properties. Electrode materials strongly influenced the leakage current densities in the prepared metal-insulator-metal (MIM) capacitors. Film permittivities above 100 and leakage current densities below 1 x 10^-7 A/cm2 were achieved with several of the materials.
Abstract:
In this Thesis, we develop theory and methods for computational data analysis. The problems in data analysis are approached from three perspectives: statistical learning theory, the Bayesian framework, and the information-theoretic minimum description length (MDL) principle. Contributions in statistical learning theory address the possibility of generalization to unseen cases, and regression analysis with partially observed data, with an application to mobile device positioning. In the second part of the Thesis, we discuss so-called Bayesian network classifiers and show that they are closely related to logistic regression models. In the final part, we apply the MDL principle to tracing the history of old manuscripts and to noise reduction in digital signals.
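A standard way to make the stated connection concrete (my own illustration, not text from the thesis): for a two-class naive Bayes model, a simple instance of a Bayesian network classifier, the posterior log-odds of the class Y given features x = (x_1, ..., x_d) decompose as

    \log \frac{P(Y=1 \mid x)}{P(Y=0 \mid x)} = \log \frac{P(Y=1)}{P(Y=0)} + \sum_{i=1}^{d} \log \frac{P(x_i \mid Y=1)}{P(x_i \mid Y=0)} ,

i.e. the log-odds are an additive function of per-feature terms, which is exactly the functional form of a logistic regression model; the two approaches differ mainly in how the parameters are estimated (generatively versus discriminatively).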
Abstract:
Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion. Due to computational difficulties, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
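To make the quantity being computed concrete, the following is a small Python sketch of my own: a brute-force reference for the NML normalizing constant of the multinomial model, not the efficient algorithms developed in the thesis. It evaluates the sum over all count vectors with a simple dynamic program.

```python
from math import comb, log2
from functools import lru_cache

def multinomial_nml_normalizer(K: int, n: int) -> float:
    """Parametric complexity C(K, n) of the multinomial model with K categories
    and sample size n, computed by a simple O(K * n^2) dynamic program
    (a brute-force reference, not an optimized algorithm):

        C(K, n) = sum over count vectors (h_1..h_K) with h_1+...+h_K = n of
                  n! / (h_1! ... h_K!) * prod_k (h_k / n)^h_k,   with 0^0 = 1.
    """
    @lru_cache(maxsize=None)
    def g(k: int, m: int) -> float:
        # g(k, m) = sum over count vectors of length k summing to m of
        #           m! / prod(h_i!) * prod(h_i ** h_i); then C(K, n) = g(K, n) / n^n.
        if k == 1:
            return 1.0 if m == 0 else float(m) ** m
        return sum(comb(m, r) * (float(r) ** r if r else 1.0) * g(k - 1, m - r)
                   for r in range(m + 1))

    return g(K, n) / float(n) ** n if n > 0 else 1.0

# The NML code length of a sample x^n then adds log2 C(K, n) to the maximized
# negative log-likelihood; here only the normalizer is shown.
print(multinomial_nml_normalizer(2, 2))          # 2.5
print(log2(multinomial_nml_normalizer(4, 100)))  # parametric complexity in bits
```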
Abstract:
The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, applications of this modern version of MDL have been quite rare because of computational complexity problems: for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.
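For reference, the standard form of these notions (a textbook formulation, not quoted from the dissertation): for a model class M with maximum-likelihood estimator \hat{\theta}(\cdot), the NML distribution over discrete samples of size n and the corresponding stochastic complexity are

    P_{NML}(x^n \mid M) = \frac{P(x^n \mid \hat{\theta}(x^n), M)}{\sum_{y^n} P(y^n \mid \hat{\theta}(y^n), M)} ,
    \qquad SC(x^n \mid M) = -\log P_{NML}(x^n \mid M) ,

where the normalizing sum runs over all possible samples of size n; this is the exponential sum referred to above that makes direct evaluation infeasible for most model families.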
Abstract:
One of the most tangled fields of research is the field of defining and modeling affective concepts, i.e. concepts regarding emotions and feelings. The subject can be approached from many disciplines. The main problem is the lack of generally approved definitions. However, linguists, for example, have recently started to check the consistency of their theories with the help of computer simulations. Definitions of affective concepts are needed for performing similar simulations in the behavioral sciences. In this thesis, preliminary computational definitions of affects for a simple utility-maximizing agent are given. The definitions have been produced by synthesizing ideas from theories from several fields of research. The class of affects is defined as a superclass of emotions and feelings. An affect is defined as a process in which a change in an agent's expected utility causes a bodily change. If the process is currently under the attention of the agent (i.e. the agent is conscious of it), the process is a feeling. If it is not, but can in principle be taken into attention (i.e. it is preconscious), the process is an emotion. Thus, affects do not presuppose consciousness, but emotions and feelings do. Affects directed at unexpected materialized (i.e. past) events are delight and fright. Delight is the consequence of an unexpected positive event and fright is the consequence of an unexpected negative event. Affects directed at expected materialized (i.e. past) events are happiness (an expected positive event materialized), disappointment (an expected positive event did not materialize), sadness (an expected negative event materialized) and relief (an expected negative event did not materialize). Affects directed at expected unrealized (i.e. future) events are fear and hope. Some other affects can be defined as directed towards the originators of the events. The affect classification has also been implemented as a computer program, the purpose of which is to ensure the coherence of the definitions and also to illustrate the capabilities of the model. The exact content of the bodily changes associated with specific affects is not considered relevant from the point of view of the logical structure of affective phenomena. The utility function also need not be defined, since the target of examination is only its dynamics.
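As a purely illustrative sketch (the names and interface below are my own; only the classification logic is taken from the definitions in the abstract above), the event-directed affects could be encoded as follows:

```python
from enum import Enum
from typing import Optional

class Affect(Enum):
    DELIGHT = "delight"                # unexpected positive event materialized
    FRIGHT = "fright"                  # unexpected negative event materialized
    HAPPINESS = "happiness"            # expected positive event materialized
    DISAPPOINTMENT = "disappointment"  # expected positive event did not materialize
    SADNESS = "sadness"                # expected negative event materialized
    RELIEF = "relief"                  # expected negative event did not materialize
    HOPE = "hope"                      # expected positive event lies in the future
    FEAR = "fear"                      # expected negative event lies in the future

def classify_event_affect(expected: bool, positive: bool, in_future: bool,
                          materialized: Optional[bool] = None) -> Affect:
    """Classify an event-directed affect following the definitions above.
    The function name and signature are illustrative, not from the thesis."""
    if in_future:
        # Affects directed at expected unrealized (future) events.
        return Affect.HOPE if positive else Affect.FEAR
    if not expected:
        # Affects directed at unexpected materialized (past) events.
        return Affect.DELIGHT if positive else Affect.FRIGHT
    # Affects directed at expected past events, materialized or not.
    if positive:
        return Affect.HAPPINESS if materialized else Affect.DISAPPOINTMENT
    return Affect.SADNESS if materialized else Affect.RELIEF

print(classify_event_affect(expected=True, positive=False, in_future=False,
                            materialized=False))  # Affect.RELIEF
```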
Abstract:
The aim of this paper is to present the evolution of the Francovich doctrine within the European legal order. The first part deals with the gradual development of the ECJ's case law on State liability in damages for breach of EC law. Starting from the seminal Francovich and Brasserie du Pêcheur, the clarification of the criteria set by the Court is attempted with reference to subsequent case law, and issues concerning the extent and form of the compensation owed are also mentioned. The second part concerns one of the more recent developments in the field, namely State liability for breaches of Community law attributed to the national judiciary. The Court's ruling in Köbler is examined in connection with two other recent judgments, namely Commission v. Italy of 2003 and Kühne & Heitz, as an attempt by the ECJ to reframe its relationship with national supreme courts and to appropriate for itself the position of the Supreme Court in the European legal order. The implications for State liability claims of the ruling in Commission v. France of 1997 constitute the theme of the third part, where it is submitted that Member States can also be held liable for disregard of Community law by private individuals within their respective territories. To this extent, Schmidberger is viewed as a manifestation of this opinion, with fundamental rights acquiring a new dimension, being invoked by the States, against the individuals, as a shield to liability claims. Finally, the last part examines the relationship between the Francovich doctrine and the principle of legal certainty and concludes that the solutions employed by the ECJ have been both predictable and acceptable to the national legal orders. Keywords: State liability, damages, Francovich, Köbler, Schmidberger
The Mediated Immediacy : João Batista Libanio and the Question of Latin American Liberation Theology
Abstract:
This study is a systematic analysis of mediated immediacy in the production of the Brazilian professor of theology João Batista Libanio. He stresses both ethical mediation and the immediate character of the faith. Libanio has sought an answer to the problem of science and faith. He makes use of the neo-scholastic distinction between matter and form. According to St. Thomas Aquinas, God cannot be known as a scientific object, but it is possible to predicate a formal theological content of other subject matter with the help of revelation. This viewpoint was emphasized in neo-Thomism and supported by the liberation theologians. For them, the material starting point was social science. It becomes a theologizable or revealable (revelabile) reality. This social science has its roots in Latin American Marxism, which was influenced by the school of Louis Althusser and considered Marxism a "science of history". The synthesis of Thomism and Marxism is a challenge Libanio faced, especially in his Teologia da libertação from 1987. He emphasized the need for a genuinely spiritual and ethical discernment, and was particularly critical of the ethical implications of class struggle. Libanio's thinking has a strong hermeneutic flavor. It is more important to understand than to explain. He does not deny the need for social scientific data, but maintains that they cannot be the exclusive starting point of theology. There are different readings of the world, both scientific and theological. A holistic understanding of the nature of religious experience is needed. Libanio follows the interpretation given by H. C. de Lima Vaz, according to whom the Hegelian dialectic is a rational circulation between the totality and its parts. He also recalls Oscar Cullmann's idea of God's Kingdom that is "already" and "not yet". In other words, there is a continuous mediation of grace into the natural world. This dialectic is reflected in ethics. Faith must be verified in good works. Libanio uses the Thomist fides caritate formata principle and the modern orthopraxis thinking represented by Edward Schillebeeckx. One needs both the "ortho" of good faith and the "praxis" of right action. The mediation of praxis is the mediation of human and divine love. Libanio's theology has strong roots in the Jesuit spirituality that places the emphasis on contemplation in action.
Abstract:
Can war be justified? Expressions of opinion by the general assemblies of the World Council of Churches on the question of war as a method of settling conflicts. The purpose of this study is to describe and analyse the expressions of opinion recorded in the documents of the general assemblies of the WCC during the Cold War period from 1948 to 1983 on the use of war as a method of settling international and national conflicts. The main sources are the official reports of the WCC's assemblies during the years 1948 to 1983. This study divides the discussions into three periods. The first period (1949-1968) is dominated by the pressures arising from the Second World War. Experiences of the war led the assemblies of the WCC to the conclusion that modern warfare as a method of settling conflicts should be rejected. Modern war was contrary to God's purposes and the whole meaning of creation, said the assembly. Although the WCC rejected modern war, it left open the possibility of conflict where principles of just war might be practised. The question of war was also linked to the state and its function, which led to the need to create a politically neutral doctrine for the socio-ethical thinking of the churches and of the WCC itself. The doctrine was formulated using the words "responsible society". The question of war and socio-ethical thinking were on the WCC's agenda throughout the first period. Another issue that had an influence on the first period was the increasing role of Third World countries. This new dimension also brought new aspects to the question of war and violence. The second period (1968-1975) presented greater challenges to the WCC, especially in traditional western countries. The Third World, political activity in the socialist world and ideas of revolution were discussed. The WCC's fourth Assembly in Uppsala was challenged by these new ideas of revolution. The old doctrine of the "responsible society" was seen by many participants as unsuitable in the modern world, especially for Third World countries. The situation of a world governed by armaments, causing social and economic disruption, was felt by the churches to be problematic. The peace movement gathered pace and attention. There was pressure to see armed forces as an option on the way to a new world order. The idea of a just war was challenged by that of just revolution. These ideas of revolution did not receive support from the Uppsala Assembly, but they pressured the WCC to reconsider its socio-ethical thinking. Revolution was seen as a possibility, but only when it could be peaceful. In the Nairobi Assembly the theme of a just, participatory and sustainable society provided yet another viewpoint, dealing with the life of the world and its problems as a whole. The third period (1975-1983) introduced a new, alternative doctrine, the "JPIC Process" (justice, peace and the integrity of creation), for social thinking in the WCC. The WCC no longer wanted to discuss war or poverty as separate questions, but wanted to combine all aspects of life to see the impact of an arms-governed world on humankind. Thus, during the last period, discussions focused on socio-ethical questions, where war and violence were only parts of a larger problem. Through the new JPIC Process, the WCC's Assembly in Vancouver looked for a new world, one without violence, in all aspects of life. Despite differing opinions in socio-ethical thinking, the churches in the WCC agreed that modern warfare cannot be regarded as acceptable or just.
The old idea of a "just war" still had a place, but it was not seen by all as a valid principle. As a result, the WCC viewed war as a last resort, to be employed only when all other methods had failed. Such a war would have to secure peace and justice for all. In the discussions there was a strong political east-west divide and, during the last two decades, a north-south divide as well. The effect of the Cold War was obvious. In the background of the theological positions were two main concepts, namely the idea of God's activity in man's history through the so-called regiments, and the concept of the Kingdom of God on Earth.
Kirkon vai valtion kirjat? : Uskontokuntasidonnaisuuden ongelma Suomen väestökirjanpidossa 1839-1904
Abstract:
The Population Register – run by the Church or the state? The problem posed by the obligation to belong to a religious community in the registration of births and deaths in Finland between 1839 and 1904 The Lutheran Church of Finland is the nation’s largest church; approximately 82 per cent of Finns were members in 2007. The Church ran an official register of its members until 1999, when the state took over this task. The registration of births and deaths by the Church has a long history dating back to the 17th century, when Bishop Johannes Gezelius Sr. decreed that all parish members would have to be recorded in parish registers. These registers were used to control how well parish members knew the Christian doctrine and, gradually, also whether they were literate. Additionally, the Church attempted to ensure by means of the parish registers that parish members went to Holy Communion annually. Since everyone was a member of the Lutheran Church, the state also took advantage of the parish registers and used them for the purposes of tax collection and conscription. The main research theme of “The Population Register – run by the Church or the state?” goes back to these times. The actual research period covers the years 1839–1904. At that time Finland was under Russian rule, although autonomous. In the late 19th century the press and different associations in Finland began to engage in public debate, and the country started moving from a submissive society to a civic one. The identity of the Lutheran Church also became more prominent when the Church Act and the General Synod were realised in 1869. A few years earlier, municipal and parish administrations had been separated, but the general registration of births and deaths was left to the Church to see to. In compliance with the constitution of the country, all the inhabitants in principle still had to be Lutheran. In practice, the situation was different. The religious and ideological realms diversified, and the Lutheran concept of religion was no longer acceptable to everyone. The conflict was reflected in the registration of births and deaths, which was linked to the Lutheran Church and its parish registers. Nobody was allowed to leave the Church, there was no civil register, and the Lutheran Church did not consent to record unbaptized children in the parish registers. Therefore such children were left without civil rights. Thus the obligation to belong to a religious community had become a problem in the registration of births and deaths. The Lutheran clergy also appealed to the 1723 privileges, according to which they had been exempted from the drawing up of additional population registers. In 1889 Finland passed the Dissenters Act. By virtue of this act the Baptists and the Methodists left the state Church, but this was not the case with the members of the free churches. The freethinkers had to retain their church membership, as the law did not apply to them. This meant that the unbaptized children of the members of the free churches or those of freethinkers were still not entered in any registers. The children were not able to go to school, work for the state or legally marry. Neither were they able to inherit property, as they did not legally exist. The system of parish registers was created when everyone was required to be a member of the Lutheran Church, but it did not work when liberal attitudes eventually penetrated the sphere of religion, too.
The government's measures to solve the problem were slow and cautious, partly because Finland was part of Russia and partly because there were only about 100 unbaptized children. As the problem group was small and the state's resources were limited, no general civil register was established. The state accepted the fact that, in spite of the problems, the Evangelical Lutheran Church and the congregations of dissenters were the only official establishments to run population registers in the country, for social purposes as well. In 1900 the Diet of Finland finally approved a limited civil register, in which unbaptized children and unregistered foreigners would be recorded. For political reasons, the civil register did not come into existence until 1917, after the actual research period.