978 results for Ambiguous Figures


Relevance: 10.00%

Publisher:

Abstract:

Throughout the history of Linnean taxonomy, species have been described with varying degrees of justification. Many descriptions have been based on only a few ambiguous morphological characters. Moreover, species have been considered natural, well-defined units whereas higher taxa have been treated as disparate, non-existent creations. In the present thesis a few such cases were studied in detail. Often the species-level descriptions were based on only a few specimens and the variation previously thought to be interspecific was found to be intraspecific. In some cases morphological characters were sufficient to resolve the evolutionary relationships between the taxa, but generally more resolution was gained by the addition of molecular evidence. However, both morphological and molecular data were found to be deceptive in some cases. The DNA sequences of morphologically similar specimens were found to differ distinctly in some cases, whereas in other closely related species the morphology of specimens with identical DNA sequences differed substantially. This study counsels caution when evolutionary relationships are being studied utilizing only one source of evidence or a very limited number of characters (e.g. barcoding). Moreover, it emphasizes the importance of high quality data as well as the utilization of proper methods when making scientific inferences. Properly conducted analyses produce robust results that can be utilized in numerous interesting ways. The present thesis considered two such extensions of systematics. A novel hypothesis on the origin of bioluminescence in Elateriformia beetles is presented, tying it to the development of the clicking mechanism in the ancestors of these animals. An entirely different type of extension of systematics is the proposed high value of the white sand forests in maintaining the diversity of beetles in the Peruvian Amazon. White sand forests are under growing pressure from human activities that lead to deforestation. 
They were found to harbor an extremely diverse beetle fauna, and many taxa were specialists living only in this unique habitat. In comparison to the predominant clay-soil forests, considerably more elateroid beetles at all studied taxonomic levels (species, genus, tribe, and subfamily) were collected in white sand forests. This evolutionary diversity is hypothesized to be due to a combination of factors: (1) the forest structure, which favors the fungus-plant interactions important for the elateroid beetles; (2) the old age of the forest type, favoring the survival of many evolutionary lineages; and (3) the widespread distribution and fragmentation of the forests in the Miocene, favoring speciation.

Abstract:

Industrial ecology is an important field of sustainability science. It can be applied to study environmental problems in a policy-relevant manner. Industrial ecology uses an ecosystem analogy: it aims at closing the loop of materials and substances while reducing resource consumption and environmental emissions. Emissions from human activities are related to human interference in material cycles. Carbon (C), nitrogen (N) and phosphorus (P) are essential elements for all living organisms, but in excess they have negative environmental impacts, such as climate change (CO2, CH4, N2O), acidification (NOx) and eutrophication (N, P). Several indirect macro-level drivers affect changes in emissions. Population and affluence (GDP/capita) often act as upward drivers of emissions. Technology, as emissions per service used, and consumption, as economic intensity of use, may act as drivers resulting in a reduction in emissions. In addition, the development of country-specific emissions is affected by international trade. The aim of this study was to analyse changes in emissions as affected by macro-level drivers in different European case studies. ImPACT decomposition analysis (IPAT identity) was applied as a method in papers I–III. The macro-level perspective was applied to evaluate CO2 emission reduction targets (paper II) and the sharing of greenhouse gas emission reduction targets (paper IV) in the European Union (EU27) up to the year 2020. Data for the study were mainly gathered from official statistics. In all cases, the results were discussed from an environmental policy perspective. The development of nitrogen oxide (NOx) emissions was analysed in the Finnish energy sector during a long time period, 1950–2003 (paper I). Finnish emissions of NOx began to decrease in the 1980s as progress in technology in terms of NOx/energy curbed the impact of the growth in affluence and population.
Carbon dioxide (CO2) emissions related to energy use during 1993–2004 (paper II) were analysed by country and region within the European Union. Considering energy-based CO2 emissions in the European Union, dematerialization and decarbonisation did occur, but not sufficiently to offset population growth and the rapidly increasing affluence during 1993–2004. The development of the nitrogen and phosphorus load from aquaculture in relation to salmonid consumption in Finland during 1980–2007 was examined, including international trade in the analysis (paper III). The case combined a regional environmental issue, eutrophication of the Baltic Sea, with a marginal yet locally important source of nutrients. Nutrient emissions from Finnish aquaculture decreased from the 1990s onwards: although population, affluence and salmonid consumption steadily increased, aquaculture technology improved and the relative share of imported salmonids increased. According to the sustainability challenge in industrial ecology, the environmental impact of growing population size and affluence should be compensated for by improvements in technology (emissions/service used) and by dematerialisation. In the studied cases, the emission intensity of energy production could be lowered for NOx by cleaning the exhaust gases. Reorganization of the structure of energy production as well as technological innovations will be essential in lowering the emissions of both CO2 and NOx. Regarding the intensity of energy use, making the combustion of fuels more efficient and reducing energy use are essential. In reducing nutrient emissions from Finnish aquaculture to the Baltic Sea (paper III) through technology, limits set by, among others, the biological and physical properties of cultured fish will eventually be faced. Regarding consumption, salmonids are preferred to many other protein sources. Regarding trade, increasing the proportion of imports will outsource the impacts.
Besides improving technology and dematerialization, other viewpoints may also be needed. Reducing the total amount of nutrients cycling in energy systems and eventually contributing to NOx emissions needs to be emphasized. Considering aquaculture emissions, nutrient cycles can be partly closed by using local fish as feed to replace imported feed. In particular, the reduction of CO2 emissions in the future is a very challenging task when considering the necessary rates of dematerialisation and decarbonisation (paper II). Climate change mitigation may have to focus on greenhouse gases other than CO2 and, among other options, on the potential role of biomass as a carbon sink. The global population is growing and scaling up the environmental impact. Population issues and growing affluence must be considered when discussing emission reductions. Climate policy has only very recently had an influence on emissions, and strong actions for climate change mitigation are now called for. Environmental policies in general must cover all the regions related to production and impacts in order to avoid outsourcing of emissions and leakage effects. The macro-level drivers affecting changes in emissions can be identified with the ImPACT framework. Statistics for generally known macro-indicators are currently relatively well available for different countries, and the method is transparent. In the papers included in this study, a similar method was successfully applied in different types of case studies. Using transparent macro-level figures and a simple top-down approach is also appropriate in evaluating and setting international emission reduction targets, as demonstrated in papers II and IV. The projected rates of population and affluence growth are especially worth consideration in setting targets. However, sensitivities in calculations must be carefully acknowledged. In the basic form of the ImPACT model, the economic intensity of consumption and the emission intensity of use are included.
In seeking to examine consumption but also international trade in more detail, imports were included in paper III. This example demonstrates well how outsourcing of production influences domestic emissions. Country-specific production-based emissions have often been used in similar decomposition analyses. Nevertheless, trade-related issues must not be ignored.
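The ImPACT identity described above multiplies the macro-level drivers: population (P), affluence (A, GDP per capita), economic intensity of consumption (C, service used per unit of GDP) and technology (T, emissions per service used). A minimal sketch of the decomposition; the figures below are illustrative placeholders, not values from the papers:

```python
def impact_emissions(population, affluence, intensity_of_use, emissions_per_service):
    """ImPACT identity: emissions = P * A * C * T.

    population            -- P, number of people
    affluence             -- A, GDP per capita
    intensity_of_use      -- C, service used per unit of GDP
    emissions_per_service -- T, emissions per unit of service
    """
    return population * affluence * intensity_of_use * emissions_per_service

# Decomposing a change in emissions between two years: the ratio of total
# emissions equals the product of the ratios of the individual drivers,
# which is what lets each driver's contribution be read off separately.
e_start = impact_emissions(5.0e6, 20_000, 0.50, 0.0020)
e_end = impact_emissions(5.2e6, 25_000, 0.45, 0.0015)
driver_ratios = (5.2e6 / 5.0e6, 25_000 / 20_000, 0.45 / 0.50, 0.0015 / 0.0020)
product = 1.0
for r in driver_ratios:
    product *= r
assert abs(e_end / e_start - product) < 1e-9
```

Because the identity is multiplicative, a decline in one driver (here the emission intensity T) can be offset by growth in population and affluence, which is exactly the pattern the CO2 case study reports.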

Abstract:

It has been said that we are living in a golden age of innovation. New products, systems and services aimed at enabling a better future have emerged from novel interconnections between design and design research and science, technology and the arts. These intersections are now, more than ever, catalysts that enrich daily activities for health and safety, education, personal computing, entertainment and sustainability, to name a few. Interactive functions made possible by new materials, technology, and emerging manufacturing solutions demonstrate an ongoing interplay between cross-disciplinary knowledge and research. Such interplay brings up questions concerning: (i) how art and design provide a focus for developing design solutions and research in technology; (ii) how theories emerging from the interactions of cross-disciplinary knowledge inform both the practice and research of design; and (iii) how research and design work together in a mutually beneficial way. The IASDR2015 INTERPLAY EXHIBITION provides some examples of these interconnections of design research with science, technology and the arts. This is done through the presentation of objects, artefacts and demonstrations that are contextualised in everyday activities across various areas including health, education, safety, furniture, fashion and wearable design. The exhibits provide a setting to explore the various ways in which design research interacts across disciplinary knowledge and approaches to stimulate innovation. In education, Designing South African Children’s Health Education as Generative Play (A Bennett, F Cassim, M van der Merwe, K van Zijil, and M Ribbens) presents a set of toolkits that resulted from design research entailing generative play. The toolkits are systems that engender pleasure and responsibility, and are aimed at cultivating South African youths’ awareness of nutrition, hygiene, disease awareness and prevention, and social health.
In safety, AVAnav: Avalanche Rescue Helmet (Jason Germany) delivers an interactive, helmet-mounted system that helps reduce the time needed to locate buried avalanche victims. The system responds to the contextual needs of rescuers and has since led to further design research on the interface design of rescue devices. In apparel design and manufacturing, Shrinking Violets: Fashion design for disassembly (Alice Payne) proposes design for disassembly through beautiful reversible mono-material garments that respond to the challenges of garment construction in the fashion industry, capturing a metaphor for the interplay between technology and craft in fashion manufacturing. Harvest: A biotextile future (Dean Brough and Alice Payne) explores the interplay of biotechnology, materiality and textile design in the creation of a sustainable, biodegradable vegan textile grown from a symbiotic culture of bacteria and yeast (SCOBY). The SCOBY is a pellicle curd that can be harvested, machine washed, dried and cut into a variety of designs and texture combinations. The exploration of smart materials, wearable design and micro-electronics led to creative and aesthetically coherent stimulus-reactive jewellery in Symbiotic Microcosms: Crafting Digital Interaction (K Vones). This work aims to bridge the gap between craft practitioner and scientific discovery, proposing a move towards the notion of a post-human body, where wearable design is seen as potential ground for new human-computer interactions, affording the development of visually engaging multifunctional enhancements. In furniture design, Smart Assistive Chair for Older Adults (Chao Zhao) demonstrates how cross-disciplinary knowledge interacting with design strategies provides solutions that employ new technological developments in aged care and involve multiple stakeholders: designers, the health care system and community-based health systems.
In health, Molecular Diagnosis System for Newborns’ Deafness Genetic Screening (Chao Zhao) presents an ambitious and complex project that includes a medical device aimed at resolving a number of challenges: technical feasibility for city and rural contexts, compatibility with standard laboratory and hospital systems, access to the health system, and support for the work of different hospital specialists. The interplay between disciplines is evident in this work, demonstrating how design research moves forward through technological developments. These works exemplify the intersection between domains as a means to innovation. Novel design problems are identified as design intersects with the various areas. Research informs this process in different ways. We see background investigation into the contextualising domain (e.g. on-snow studies, garment recycling, South African health concerns, the post-human body) to identify gaps in the area and design criteria; reviews of technologies and materials (e.g. AR, biotextiles) to offer plausible technical means to solve these, as well as design criteria. Theoretical reviews can also inform the design (e.g. play, flow). These work together to equip the design practitioner with a robust set of ‘tools’ for design innovation – tools that are based in research. The process identifies innovative opportunity and criteria for design and this, in turn, provides a means for evaluating the success of the design outcomes. Such an approach has the potential to come full circle between research and design, where the design can function as an exemplar, evidencing how the research-articulated problems can be solved. Core to this, however, is the evaluation of the design outcome itself and the identification of knowledge outcomes. In some cases this is fairly straightforward, that is, easily measurable. For example, the efficacy of Jason Germany’s helmet can be determined by measuring the reduced response time of the rescuer.
Similarly, the improved recyclability of Payne’s panel garments can be determined by comparing them to existing recycling processes (and against her identified criterion of separating textile elements); and the sustainability and durability of Brough and Payne’s biotextile can be assessed by documenting the growth and decay processes, or through comparative strength studies. There are, however, situations where knowledge outcomes and insights are not so easily determined. Many of the works here are open-ended in nature, as they emphasise the holistic experience of one or more designs in context: it is not “the end result of the art activity that provides the health benefit or outcome but rather, the value lies in the delivery and experience of the activity” (Bennett et al.). Similarly, reconfiguring layers of laser-cut silk in Payne’s Shrinking Violets constitutes a customisable, creative process of clothing oneself, since it “could be layered to create multiple visual effects”. Symbiotic Microcosms also has room for facilitating experience, as the work is described as facilitating “serendipitous discovery”. These examples show the diverse emphasis of enquiry on the experience versus the product. Open-ended experiences are ambiguous, multifaceted and differ from person to person and moment to moment (Eco 1962). Determining success is not always clear or immediately discernible; it may also not be the most useful question to ask. Rather, research that seeks to understand the nature of the experience afforded by the artefact is most useful in these situations. It can inform the design practitioner by helping them with subsequent re-design, as well as potentially being generalizable to other designers and design contexts. Bennett et al. exemplify how this may be approached from a theoretical perspective. This work is concerned with facilitating engaging experiences to educate and, ultimately, to have an impact on that community.
The research is concerned with the nature of that experience as well, and to address it the authors have employed theoretical lenses – here those of flow, pleasure and play. An alternative or complementary approach to using theory is qualitative study, such as interviewing users about what they experienced. Here the user insights become evidence for generalisation, potentially revealing insight into relevant concerns – such as the range of possible ‘playful’ experiences that may be afforded, or the situation that preceded a ‘serendipitous discovery’. As shown, the IASDR2015 INTERPLAY EXHIBITION provides a platform for exploration, discussion and interrogation around the interplay of design research across diverse domains. We look forward with excitement as IASDR continues to bring research and design together, and as our communities of practitioners continue to push the envelope of what design is and how it can be expanded and better understood with research to foster new work and, ultimately, stimulate innovation.

Abstract:

The English country dance The Salamanca Castanets was published in Button and Whittaker’s Collection of Country Dances for the Year 1813. It is set to a captivating tune and utilises country dance figures whilst capturing and maintaining a Spanish exuberance. Its fascinating history still resonates in Tasmania.

Abstract:

Intensive care is to be provided to patients benefiting from it, in an ethical, efficient, effective and cost-effective manner. This implies a long-term qualitative and quantitative analysis of intensive care procedures and related resources. The study population consists of 2709 patients treated in the general intensive care unit (ICU) of Helsinki University Hospital. The study investigates intensive care patients' mortality, quality of life (QOL), quality-adjusted life-years (QALY units) and factors related to severity of illness, length of stay (LOS), patient's age and evaluation period, as well as experiences and memories connected with the ICU episode. In addition, the study examines the qualities of two QOL measures, the RAND 36-Item Health Survey 1.0 (RAND-36) and the 5-item EuroQol-5D (EQ-5D), and assesses the correlation of their results. Patients treated in 1995 responded to the RAND-36 questionnaire in 1996. All patients treated from 1995–2000 received a QOL questionnaire in 2001, when 1–7 years had elapsed since the intensive treatment. The response rate was 79.5%. Main results: 1) Of the patients who died within the first year (n = 1047), 66% died during the intensive care period or within the following month. The non-survivors were older than the surviving patients, generally had higher-than-average APACHE II and SOFA scores depicting the severity of illness, and their ICU LOS was longer and hospital stay shorter than those of the surviving patients (p < 0.001). Mortality of patients receiving conservative treatment was higher than that of those receiving surgical treatment. Patients replying to the QOL survey in 2001 (n = 1099) had recovered well: 97% of them lived at home. More than half considered their QOL good or extremely good, 40% satisfactory and 7% bad. All QOL indices of those of working age were considerably lower (p < 0.001) than the comparable figures for the age- and gender-adjusted Finnish population.
The 5-year monitoring period made evident that mental recovery was slower than physical recovery. 2) The results of the RAND-36 and EQ-5D correlated well (p < 0.01). The RAND-36 profile measure distinguished more clearly between the different categories of QOL and their levels. The EQ-5D measured the patient groups' general QOL well, and its sum index was used to calculate QALY units. 3) QALY units were calculated by multiplying the time the patient survived after the ICU stay, or the expected life-years, by the EQ-5D sum index. Ageing automatically lowers the number of QALY units. Patients under the age of 65 receiving conservative treatment benefited from treatment to a greater extent, measured in QALY units, than their peers receiving surgical treatment, but in the age group 65 and over, patients with surgical treatment received higher QALY ratings than recipients of conservative treatment. 4) The intensive care experience and QOL ratings were connected. The QOL indices were statistically highest for those recipients with memories of intensive care as a positive experience, although their illness requiring intensive care was less serious than average. No statistically significant differences were found in the QOL indices of those with negative memories, no memories, or those who did not express the quality of their experiences.
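The QALY calculation described in point 3 is a simple product of survival time and the EQ-5D sum index. A minimal sketch; the index values and life-years below are illustrative, not data from the study:

```python
def qaly_units(life_years, eq5d_index):
    """QALY units = life-years survived after the ICU stay (or expected
    life-years) multiplied by the EQ-5D sum index, where an index of 1.0
    represents full health and 0.0 represents death."""
    return life_years * eq5d_index

# Two survivors with the same health-related quality of life (index 0.75)
# but different remaining life expectancies: the older patient accrues
# fewer QALY units, which is why ageing automatically lowers the figure.
assert qaly_units(20, 0.75) == 15.0
assert qaly_units(40, 0.75) == 30.0
```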

Abstract:

MEG directly measures neuronal events and has greater temporal resolution than fMRI, whose temporal resolution is limited mainly by the larger timescale of the hemodynamic response. On the other hand, fMRI has advantages in spatial resolution, while localization results with MEG can be ambiguous due to the non-uniqueness of the electromagnetic inverse problem. Thus, these methods could provide complementary information and could be used to create both spatially and temporally accurate models of brain function. We investigated the degree of overlap, revealed by the two imaging methods, in areas involved in sensory or motor processing in healthy subjects and neurosurgical patients. Furthermore, we used the spatial information from fMRI to construct a spatiotemporal model of the MEG data in order to investigate the sensorimotor system and to create a spatiotemporal model of its function. We compared the localization results from MEG and fMRI with invasive electrophysiological cortical mapping. We used a recently introduced method, contextual clustering, for hypothesis testing of fMRI data and assessed the effect of using neighbourhood information on the reproducibility of fMRI results. Using MEG, we identified the ipsilateral primary sensorimotor cortex (SMI) as a novel source area contributing to the somatosensory evoked fields (SEF) elicited by median nerve stimulation. Using combined MEG and fMRI measurements, we found that two separate areas in the lateral fissure may be the generators of the SEF responses from the secondary somatosensory cortex region. The two imaging methods indicated activation in corresponding locations. By using complementary information from MEG and fMRI, we established a spatiotemporal model of somatosensory cortical processing.
This spatiotemporal model of cerebral activity was in good agreement with results from several studies using invasive electrophysiological measurements and with anatomical studies in monkey and man concerning the connections between somatosensory areas. In neurosurgical patients, the MEG dipole model turned out to be more reliable than fMRI in the identification of the central sulcus. This was due to prominent activation in non-primary areas in fMRI, which in some cases led to erroneous or ambiguous localization of the central sulcus.

Abstract:

A promotional brochure celebrating the completion of the Seagram Building in spring 1957 features on its cover intense portraits of seven men bisected by a single line of bold text that asks, “Who are these Men?” The answer appears on the next page: “They Dreamed of a Tower of Light” (Figures 1, 2). Each photograph is reproduced with the respective man’s name and project credit: architects, Mies van der Rohe and Philip Johnson; associate architect, Eli Jacques Kahn; electrical contractor, Harry F. Fischbach; lighting consultant, Richard Kelly; and electrical engineer, Clifton E. Smith. To the right, a rendering of the new Seagram Tower anchors the composition, standing luminous against a star-speckled night sky; its glass walls and bronze mullions are transformed into a gossamer skin that reveals the tower’s structural skeleton. Lightolier, the contract lighting manufacturer, produced the brochure to promote its role in the lighting of the Seagram Building, but Lightolier’s promotional copy was not far from the truth.

Abstract:

Laser-mediated stimulation of biological processes was among the very first laser effects documented by Mester et al., but the ambiguous, tissue- and cell-context-specific biological effects of laser radiation are now termed ‘photobiomodulation’. We found many parallels between the reported biological effects of lasers and a multifaceted growth factor, Transforming Growth Factor-β (TGF-β). This review outlines the interesting parallels between the two fields and our rationale for pursuing their potential causal correlation. We explored this correlation using in vitro assay systems and a human clinical trial on the healing of wound extraction sockets, which we reported in a recent publication. In conclusion, we report that low-power laser irradiation can activate latent TGF-β1 and TGF-β3 complexes, and suggest that this might be one of the major modes of the photobiomodulatory effects of low-power lasers.

Abstract:

The structure and the mechanical properties of wood of Norway spruce (Picea abies [L.] Karst.) were studied using small samples from Finland and Sweden. X-ray diffraction (XRD) was used to determine the orientation of cellulose microfibrils (microfibril angle, MFA), the dimensions of cellulose crystallites and the average shape of the cell cross-section. X-ray attenuation and X-ray fluorescence measurements were used to study the chemical composition and the trace element content. Tensile testing with in situ XRD was used to characterise the mechanical properties of wood and the deformation of crystalline cellulose within the wood cell walls. Cellulose crystallites were found to be 192–284 Å long and 28.9–33.4 Å wide in chemically untreated wood, and they were longer and wider in mature wood than in juvenile wood. The MFA distribution of individual Norway spruce tracheids and larger samples was asymmetric. In individual cell walls, the mean MFA was 19–30 degrees, while the mode of the MFA distribution was 7–21 degrees. Both the mean MFA and the mode of the MFA distribution decreased as a function of the annual ring. Tangential cell walls exhibited a smaller mean MFA and mode of the MFA distribution than radial cell walls. Maceration of the wood material caused narrowing of the MFA distribution and removed contributions observed at around 90 degrees. In wood of both untreated and fertilised trees, the average shape of the cell cross-section changed from circular via ambiguous to rectangular as the cambial age increased. The average shape of the cell cross-section and the MFA distribution did not change as a result of fertilisation. The mass absorption coefficient for X-rays was higher in wood of fertilised trees than in that of untreated trees, and wood of fertilised trees contained more of the elements S, Cl, and K, but a smaller amount of Mn. Cellulose crystallites were longer in wood of fertilised trees than in that of untreated trees.
Kraft cooking caused widening and shortening of the cellulose crystallites. Tensile tests parallel to the cells showed that if the mean MFA is initially around 10 degrees or smaller, no systematic changes occur in the MFA distribution due to strain. The role of the mean MFA in defining the tensile strength or the modulus of elasticity of wood was not as dominant as reported earlier. Crystalline cellulose elongated much less than the entire samples. The Poisson ratio νca of crystalline cellulose in Norway spruce wood was shown to depend largely on the surroundings of the crystalline cellulose in the cell wall, varying between -1.2 and 0.8. The Poisson ratio was negative in kraft-cooked wood and positive in chemically untreated wood. In chemically untreated wood, νca was larger in mature wood and in latewood than in juvenile wood and earlywood.
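The Poisson ratio νca discussed above can be sketched with the textbook definition, the negative ratio of transverse to axial strain; the lattice-strain values below are illustrative, and the exact strain components used in the thesis may differ:

```python
def poisson_ratio(transverse_strain, axial_strain):
    """Poisson ratio from lattice strains measured with in situ XRD:
    nu = -(transverse strain) / (axial strain).  A negative value means the
    crystal widens laterally when stretched along its axis."""
    return -transverse_strain / axial_strain

# Chemically untreated wood (positive ratio): the crystal contracts laterally.
assert poisson_ratio(-0.001, 0.002) == 0.5
# Kraft-cooked wood (negative ratio): the crystal expands laterally under tension.
assert poisson_ratio(0.002, 0.002) == -1.0
```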

Abstract:

Fundamental Tax Legislation 2016 contains the essential provisions from the primary legislation that affects Australia's taxation system. Updated and expanded for the changes that occurred in 2015, this volume is an indispensable reference for undergraduate and postgraduate students of taxation. The Year in Review section has been updated to summarise the main legislative developments in taxation over the previous 12 months, and it now includes a listing of the tax-related legislation passed during the last year as well as reference statistics (such as quarterly CPI figures and individual tax rates for residents and foreign residents). Also fully updated and revised to reflect the changes in 2015 is the Tax Rates and Tables section, which contains an accessible summary of the main tax rates and tables that students will need to refer to in their tax studies.

Abstract:

In the post-World War II era, human rights have emerged as an enormous global phenomenon. In Finland, human rights have, particularly in the 1990s, moved from the periphery to the center of public policy making and political rhetoric. Human rights education is commonly viewed as the decisive vehicle for emancipating individuals from oppressive societal structures and rendering them conscious of the equal value of others, both core ideals of the abstract discourse. Yet little empirical research has been conducted on how these goals are realized in practice. These factors provide the background for the present study which, by combining anthropological insights with critical legal theory, has analyzed the educational activities of a Scandinavian and Nordic network of human rights experts and PhD students in 2002-2005. This material has been complemented by data from the proceedings of UN human rights treaty bodies, hearings organized by the Finnish Foreign Ministry, the analysis of different human rights documents, as well as the manner in which human rights are talked of in the Finnish media. As the human rights phenomenon has expanded, human rights experts have acquired widespread societal influence. The content of human rights nevertheless remains ambiguous: on the one hand they are law, on the other, part of a moral discourse. By educating laymen on what human rights are, experts act both as intermediaries and as activists who expand the scope of rights and simultaneously exert increasing political influence. In the educational activities of the analyzed network, these roles were visible in the rhetorics of 'legality' and 'legitimacy'. Among experts, both of these rhetorics are subject to ongoing professional controversy, yet in the network they are presented as indisputable facts. This contributes to the impression that human rights knowledge is uncontested.
This study demonstrates how the network's activities embody and strengthen a conception of expertise as located in specific, structurally determined individuals. Simultaneously, its conception of learning stresses the adoption of knowledge by students, emphasizing the power of experts over them. The majority of the network's experts are Nordic males, whereas its students are predominantly Nordic females and males from Eastern European and developing countries. Contrary to the ideals of the discourse, the network's activities do not create dialogue, but instead repeat power structures which are themselves problematic.

Resumo:

Seismic passive earth pressure coefficients were computed by the method of limit equilibrium, using a pseudostatic approach for the seismic forces. Composite curved rupture surfaces were considered in the analysis. While earlier studies using this type of analysis were mainly for sands, the seismic passive earth pressure coefficients were obtained in the present study considering the effects of cohesion, surcharge, and the self-weight of the soil. The minimum seismic passive force was obtained by adding the individual minimum values of these components, and the validity of the principle of superposition was examined. Other parameters considered in the analysis were the wall batter angle, ground surface slope, soil friction angle, wall friction angle, the ratio of wall adhesion to soil cohesion, and the horizontal and vertical seismic accelerations. The seismic earth pressure coefficients were found to be highly sensitive to the seismic acceleration coefficients in both the horizontal and vertical directions. Results of the study are presented in the form of figures and tables. Comparisons of the proposed method with available theories for the seismic case are also presented.
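The superposition principle examined above can be illustrated with a minimal sketch. The function names, the schedule of the passive force as a sum of cohesion, surcharge, and self-weight terms, and all numeric values below are illustrative assumptions, not coefficients or results from the study:

```python
from math import atan

def seismic_inertia_angle(kh, kv):
    """Standard pseudostatic seismic inertia angle (radians) from the
    horizontal (kh) and vertical (kv) seismic acceleration coefficients."""
    return atan(kh / (1.0 - kv))

def passive_force_superposition(c, q, gamma, H, Kpc, Kpq, Kpg):
    """Total passive force per unit wall length as the sum of the
    cohesion, surcharge and self-weight components; the coefficients
    Kpc, Kpq, Kpg are assumed inputs (each minimized separately)."""
    return c * H * Kpc + q * H * Kpq + 0.5 * gamma * H**2 * Kpg

# Illustrative values only (not from the study):
theta = seismic_inertia_angle(kh=0.1, kv=0.05)
P = passive_force_superposition(c=10.0, q=20.0, gamma=18.0, H=5.0,
                                Kpc=2.0, Kpq=2.5, Kpg=3.0)
```

The sketch only shows how the minimum of each component would be combined; the study's actual coefficients come from the limit equilibrium analysis over composite curved rupture surfaces.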

Resumo:

In line with cultural psychology and developmental theory, a single case approach is applied to construct knowledge on how children's interactions emerge interlinked with their historical, social, cultural, and material context. The study focuses on the negotiation of constraints and meaning construction among 2- to 3-year-old children, a preschool teacher, and the researcher in settings with water. Water as an element offers a special case of cultural canalization: adults selectively monitor and guide children's access to it. The work follows the socio-cultural tradition in psychology, particularly the co-constructivist theory of human development and the Network of Meanings perspective developed at the University of São Paulo. Valsiner's concepts of the Zone of Free Movement and the Zone of Promoted Action are applied together with studies where interactions are seen as spaces of construction in which the negotiation of constraints for actions, emotions, and conceptions occurs. The corpus was derived at a Finnish municipal day care centre. During a seven-month period, children's actions were video recorded in small groups twice a month. The teacher and the researcher were present. Four sessions with two children were chosen for qualitative microanalysis; the analysis also addressed the transformations during the months covered by the study. Moreover, the data derivation was analyzed reflectively. The narrowed-down arenas for actions were continuously negotiated among the participants, both nonverbally and verbally. The adults' expectations and intentions were materialized in the arrangements of the setting, canalizing the possibilities for actions. Children's co-regulated actions emerged in relation to the adults' presence, re-structuring attempts, and the constraints of the setting. Children co-constructed novel movements and meanings in relation to the initiatives and objects offered.
Gestures, postures, and verbalizations emerged from the initially random movements and became constructed to have specific meanings and functions; meaning construction became abbreviated. The participants attempted to make sense of the ambiguous (explicit and implicit) intentions and fuzzy boundaries of promoted and possible actions: individualized yet overlapping features were continuously negotiated by all the participants. Throughout the months, children's actions increasingly corresponded to the adults' (re-defined) conceptions of 'water researchers' as an emerging group culture. Water became an instrument and a context for co-regulations. The study contributes to discussions on children as participants in cultural canalization and emphasizes the need, in early childhood education practices, for analysis of the implicit and explicit constraint structures for actions.

Resumo:

The doctoral dissertation Critic Einari J. Vehmas and Modern Art deals with one of the central figures of the Finnish art scene and his work as an art critic, art museum curator, and cultural critic. The main body of research material consists of the writings of Einari J. Vehmas (1902–1980) from 1937 to the late 1960s. Vehmas wrote art reviews for magazines, and from 1945 he was a regular art critic for one of the major newspapers in Finland. Vehmas was heavily inclined towards French literature and visual arts. Marcel Proust and Charles Baudelaire influenced his views on the nature of art from the late 1920s onwards. Vehmas is commonly regarded as the most influential art critic of post-war Finland. His writings have been referred to and cited in numerous research papers on Finnish 20th-century art. A lesser known aspect of his work is his position as the deputy director of the Ateneum Art Museum, the Finnish national gallery. Through his art museum work, his opinions also shaped the canon of modern art considered particularly Finnish following the Second World War. The main emphasis of the dissertation is on studying Vehmas's writings, but it also illustrates the diversity of his involvement in Finnish cultural life through biographical documents. The long chronological span of the dissertation emphasises how certain central themes accumulate in Vehmas's writings. The aim of the dissertation is also to show how strongly certain philosophical and theoretical concepts from the early 20th century, specifically Wassily Kandinsky's principle of inner necessity and Henri Bergson's epistemology highlighting intuition and instinct, continued to influence the Finnish art discourse even in the early 1960s, in part thanks to the writings of Vehmas. Throughout his production, Vehmas contemplated the state and future of modern art and humanity. Vehmas used a colourful, vitalistic rhetoric to emphasise the role of modern art as a building block of culture and humanity.
At the same time, however, he was a cultural pessimist whose art views became infused with anxiety, a sense of loss, and a desire to turn his back on the world.

Resumo:

The study seeks to find out whether the real burden of personal taxation has increased or decreased. In order to determine this, we investigate how the same real income has been taxed in different years. Whenever the taxes for the same real income in a given year are higher than in the base year, the real tax burden has increased. If they are lower, the real tax burden has decreased. The study thus seeks to estimate how changes in the tax regulations affect the real tax burden. It should be kept in mind that the progression in the central government income tax schedule ensures that a real change in income will bring about a change in the tax ratio. Inflation will also increase the real tax burden if the tax schedules are kept nominally the same. In the calculations of the study it is assumed that the real income remains constant, so that we can get an unbiased measure of the effects of governmental actions in real terms. The main factors influencing the amount of income taxes an individual must pay are as follows:
- Gross income (income subject to central and local government taxes).
- Deductions from gross income and taxes calculated according to tax schedules.
- The central government income tax schedule (progressive income taxation).
- The rates for the local taxes and for social security payments (proportional taxation).
In the study we investigate how much a certain group of taxpayers would have paid in taxes according to the actual tax regulations prevailing in different years if the income were kept constant in real terms. Other factors affecting tax liability are kept strictly unchanged (as constants). The resulting taxes, expressed in fixed prices, are then compared to the taxes levied in the base year (hypothetical taxation).
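The idea of hypothetical taxation can be sketched minimally: apply a given year's nominal tax schedule to an income kept constant in real terms, and express the result in base-year prices. The schedule format, function names, and all bracket and price-level values below are illustrative assumptions, not actual Finnish tax law or results from the study:

```python
def progressive_tax(income, brackets):
    """Tax from a schedule given as (lower_bound, marginal_rate) pairs,
    sorted by lower bound. A purely illustrative schedule format."""
    tax = 0.0
    for i, (lo, rate) in enumerate(brackets):
        hi = brackets[i + 1][0] if i + 1 < len(brackets) else float("inf")
        if income > lo:
            tax += (min(income, hi) - lo) * rate
    return tax

def real_tax(real_income, price_level, brackets, base_price_level=1.0):
    """Tax on the same real income under a given year's nominal schedule,
    deflated back to base-year prices (hypothetical taxation)."""
    nominal_income = real_income * price_level / base_price_level
    nominal_tax = progressive_tax(nominal_income, brackets)
    return nominal_tax * base_price_level / price_level

# Illustrative schedules: same nominal brackets, later year has higher rates
base_year = [(0, 0.0), (10_000, 0.20), (30_000, 0.40)]
later_year = [(0, 0.0), (10_000, 0.25), (30_000, 0.45)]

t0 = real_tax(40_000, price_level=1.0, brackets=base_year)
t1 = real_tax(40_000, price_level=1.2, brackets=later_year)
# t1 > t0: for this taxpayer the real tax burden has increased
```

With nominally unchanged brackets, the inflation factor alone pushes the constant real income into higher marginal rates, which is exactly the bracket-creep effect the study isolates.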
The question we are addressing is thus how much taxes a certain group of taxpayers with the same socioeconomic characteristics would have paid on the same real income according to the actual tax regulations prevailing in different years. This has been suggested as the main way to measure real changes in taxation, although there are several alternative measures with essentially the same aim. Next, an aggregate indicator of changes in income tax rates is constructed. It is designed to show how much the taxation of income has increased or decreased from one year to the next on average. The main question remains how the aggregation over all income levels should be performed. In order to determine the average real changes in the tax scales, the difference functions (differences between the actual and hypothetical taxation functions) were aggregated using taxable income as weights. Besides the difference functions, the relative changes in real taxes can be used as indicators of change. In this case the ratio between the taxes computed according to the new and the old situation indicates whether the taxation has become heavier or lighter. The relative changes in tax scales can be described in a way similar to that used in describing the cost of living, or by means of price indices. For example, we can use Laspeyres' price index formula for computing the ratio between taxes determined by the new tax scales and the old tax scales. The formula answers the question: how much more or less will be paid in taxes according to the new tax scales than according to the old ones when the real income situation corresponds to the old situation. In real terms the central government tax burden experienced a steady decline from its high post-war level up until the mid-1950s. The real tax burden then drifted upwards until the mid-1970s. The real level of taxation in 1975 was twice that of 1961. In the 1980s there was a steady phase due to the inflation corrections of tax schedules.
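The Laspeyres-type aggregation described above can be written as a short sketch, assuming a simple discrete set of income levels weighted by taxable income; the function name and the example tax functions are mine, not the study's:

```python
def laspeyres_tax_index(incomes, weights, tax_new, tax_old):
    """Ratio of taxes under the new tax scales to taxes under the old
    ones, aggregated over income levels with taxable income as weights
    (the old-situation incomes, Laspeyres-style). A value above 1 means
    taxation has become heavier on average."""
    num = sum(w * tax_new(y) for y, w in zip(incomes, weights))
    den = sum(w * tax_old(y) for y, w in zip(incomes, weights))
    return num / den

# Illustrative check: a uniform rate rise from 20 % to 22 % gives an
# index of 0.22 / 0.20 = 1.1 regardless of the income distribution.
incomes = [20_000, 40_000, 80_000]
index = laspeyres_tax_index(incomes, incomes,
                            tax_new=lambda y: 0.22 * y,
                            tax_old=lambda y: 0.20 * y)
```

As in the cost-of-living analogy, the old-year income situation fixes the weights, so the index isolates the change in the tax scales themselves.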
In 1989 the tax schedule was lowered drastically, and from the mid-1990s changes in the tax schedules have decreased the real tax burden significantly. Local tax rates have risen continuously, from 10 percent in 1948 to nearly 19 percent in 2008. Deductions have lowered the real tax burden, especially in recent years. The aggregate figures indicate how the tax ratio for the same real income has changed over the years according to the prevailing tax regulations. We call the tax ratio calculated in this manner the real income tax ratio. A change in the real income tax ratio depicts an increase or decrease in the real tax burden. The real income tax ratio declined after the war for some years. From the beginning of the 1960s to the mid-1970s it nearly doubled. From the mid-1990s the real income tax ratio has fallen by about 35 %.