898 results for Election Counting and Reporting Software


Relevância:

100.00%

Publicador:

Resumo:

Developing software is a difficult and error-prone activity, and the complexity of modern computer applications is significant; hence, an organised approach to software construction is crucial. Stepwise Feature Introduction, created by R.-J. Back, is a development paradigm in which software is constructed by adding functionality in small increments. The resulting code has an organised, layered structure and can easily be reused. Moreover, interaction with the users of the software and correctness concerns are essential elements of the development process, contributing to the high quality and functionality of the final product. The paradigm of Stepwise Feature Introduction has been applied successfully in an academic environment to a number of small-scale developments. The thesis examines the paradigm and its suitability for the construction of large and complex software systems by focusing on the development of two software systems of significant complexity. Throughout the thesis we propose a number of improvements and modifications that should be applied to the paradigm when developing or reengineering large and complex software systems. The discussion covers various aspects of software development that relate to Stepwise Feature Introduction; more specifically, we evaluate the paradigm against the common practices of object-oriented programming and design and against agile development methodologies. We also outline a strategy for testing systems built with the paradigm of Stepwise Feature Introduction.
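The layered construction that Stepwise Feature Introduction prescribes can be hinted at with a small sketch. The counter classes below are invented for illustration and do not come from the thesis; each layer is a subclass that adds one feature while preserving the behaviour of the layer below it.

```python
# Illustrative sketch of Stepwise Feature Introduction: each layer adds
# one feature on top of the previous layer (names are hypothetical).

class CounterBase:
    """Layer 0: a bare counter."""
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

class BoundedCounter(CounterBase):
    """Layer 1: adds an upper bound, refining increment()."""
    def __init__(self, limit):
        super().__init__()
        self.limit = limit

    def increment(self):
        if self.value < self.limit:
            super().increment()

class LoggedCounter(BoundedCounter):
    """Layer 2: adds logging on top of the bounded behaviour."""
    def __init__(self, limit):
        super().__init__(limit)
        self.log = []

    def increment(self):
        super().increment()
        self.log.append(self.value)

c = LoggedCounter(limit=2)
for _ in range(3):
    c.increment()
```

Because each layer only refines the one below through `super()`, the lower layers stay intact and reusable, which is the organised, layered structure the abstract describes.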

Relevância:

100.00%

Publicador:

Resumo:

Given the increased availability of different kinds of business intelligence technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management, however, is not only the management of technology but also the management of processes and people. This thesis focuses on traditional data management and on the performance management of production processes, both of which can be seen as prerequisites for long-lasting development; some operative BI solutions are also considered in the ideal state of the reporting system. The objectives of this study are to examine what requirements effective performance management of production processes places on a company's data management and reporting, and how these affect its efficiency. The research is carried out as a theoretical literature review of the subjects and as a qualitative case study of a reporting development project at Finnsugar Ltd. The case study is examined through theoretical frameworks and by active participant observation, and simple investment calculations are performed to get a better picture of the ideal state of the reporting system. According to the results of the research, the requirements for effective performance management of production processes are automation of data collection, integration of operative databases, the use of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and efficient management of processes, data and roles.
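As a rough illustration of the ETL chain named in the results, the sketch below extracts rows from two in-memory "operative databases", transforms them into a common shape, loads them into a toy "data warehouse", and performs an OLAP-style roll-up. All data and field names are made up, not from the case company.

```python
# Minimal, illustrative ETL pipeline with an OLAP-style aggregation.
from collections import defaultdict

production_db = [
    {"line": "A", "date": "2024-01-01", "output_kg": 1200},
    {"line": "B", "date": "2024-01-01", "output_kg": 900},
]
quality_db = [
    {"line": "A", "date": "2024-01-01", "rejected_kg": 60},
    {"line": "B", "date": "2024-01-01", "rejected_kg": 90},
]

def extract():
    # Extract: read rows from the operative sources.
    return production_db, quality_db

def transform(prod, qual):
    # Transform: join the sources and derive good output per row.
    rejects = {(r["line"], r["date"]): r["rejected_kg"] for r in qual}
    return [{
        "line": r["line"],
        "date": r["date"],
        "good_kg": r["output_kg"] - rejects.get((r["line"], r["date"]), 0),
    } for r in prod]

warehouse = []

def load(rows):
    # Load: append the transformed rows to the warehouse.
    warehouse.extend(rows)

load(transform(*extract()))

# OLAP-style roll-up: good output per production line.
per_line = defaultdict(int)
for row in warehouse:
    per_line[row["line"]] += row["good_kg"]
```

A real implementation would of course run against actual databases and a dedicated DW/OLAP stack; the point is only the extract-transform-load separation.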

Relevância:

100.00%

Publicador:

Resumo:

An empirical study was conducted in the area of software engineering to examine the relationships between development, testing and intended software quality, with international standards serving as the starting point. For the analysis, a round of interviews was conducted and transcribed. It was found that interaction between humans is critical, especially in transferring knowledge and the standards' processes: the standards are communicated through interaction, and learning processes precede compliance. One result was that testing is the key to sufficient quality. The outcome was that successful interaction, sufficient testing and compliance with the standards, combined with good motivation, may provide the most repeatable intended quality.

Relevância:

100.00%

Publicador:

Resumo:

The primary objective of this Master's thesis is to examine the development of electronic financial administration and how it has been reflected in Finnish and international professional journal writing in the field between 1997 and 2013. The aim is to identify, on the basis of this writing, factors that promote and factors that hinder the development of electronic financial administration, as well as the future prospects the articles project for it. A further goal is to find similarities and differences between the national and the international writing. The thesis is a qualitative study, and the research methods used are content analysis, content specification, thematic analysis and comparative research. The empirical material consists of articles on electronic financial administration published in Tilisanomat, Balanssi and Accountancy between 1997 and 2013. The results show that the development of electronic financial administration is also visible in professional journal writing. Information systems and the legislation enabling their use are the basic preconditions of electronic financial administration. The main drivers of development are public authorities, electronic invoicing and standardization, while the lack of uniform standards and prevailing attitudes are seen as slowing it down. Future prospects include standardized processes and concepts across the whole field of financial administration, and regulatory reporting using XBRL. The largest differences between national and international development are observed in electronic invoicing and XBRL reporting. The conclusion is that factors hindering and promoting the development of electronic financial administration can be identified from the writing, and that the writing provides indications of future development. Electronic financial administration develops according to each country's government policy and legislation. As further research, the study could be extended internationally by including more international journals and international legislation.

Relevância:

100.00%

Publicador:

Resumo:

Developing bioimage informatics – from microscopy to software solutions – with the α2β1 integrin as an application example. When the human genome was sequenced in 2003, the main task of the biosciences became determining the functions of the different genes, and bioimaging techniques became central research methods. Technological advances led to an explosive growth in the popularity of fluorescence-based light microscopy in particular, but microscopy had to transform from a qualitative science into a quantitative one. This change gave rise to a new discipline, bioimage informatics, which has been said to have the potential to revolutionize the biosciences. This thesis presents a broad, interdisciplinary body of work in the field of bioimage informatics. The first aim of the thesis was to develop protocols for four-dimensional live-cell confocal microscopy, one of the fastest-growing bioimaging methods. The human collagen receptor α2β1 integrin, an important molecule in many physiological and pathological processes, served as the application example. Clear visualizations of integrin movement, clustering and internalization were achieved, but tools for quantitative analysis of the image information did not exist. The second aim of the thesis thus became the development of software suited to such analysis. Bioimage informatics emerged at the same time, and specialized software was what the new field needed most urgently. The most important result of this thesis is therefore BioImageXD, a novel open-source software package for the visualization, processing and analysis of multidimensional bioimages. BioImageXD grew into one of the largest and most versatile packages of its kind; it was published in a special issue of Nature Methods on bioimage informatics and became well known and widely used. The third aim of the thesis was to apply the developed methods to something more practical. Synthetic silica nanoparticles were prepared, carrying antibodies that recognize the α2β1 integrin as "address labels". Using BioImageXD, the nanoparticles were shown to have potential in targeted drug delivery applications. One underlying aim of this thesis was to advance the new and still unknown discipline of bioimage informatics, and this aim was achieved in particular through BioImageXD and its numerous published applications. The work has significant future potential, but bioimage informatics faces serious challenges: the field is too complex for the average biomedical researcher to master, and its most central element, open-source software development, is undervalued. Several improvements are needed in these respects.
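The kind of quantification that bioimage-analysis software performs can be hinted at with a toy example: threshold a tiny grey-scale "image" and count the connected bright spots. The image, threshold and routine below are purely illustrative and are far simpler than anything in BioImageXD.

```python
# Toy spot counting: threshold, then flood-fill connected components
# (4-connectivity). Pure Python on a hand-made 4x5 "image".

image = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 7],
    [0, 0, 0, 8, 8],
    [5, 0, 0, 8, 0],
]
THRESHOLD = 6

def count_spots(img, threshold):
    h, w = len(img), len(img[0])
    mask = [[v >= threshold for v in row] for row in img]
    spots = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                spots += 1
                stack = [(y, x)]          # flood-fill one spot
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx]:
                        mask[cy][cx] = False
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return spots

n = count_spots(image, THRESHOLD)
```

Real multidimensional bioimage analysis adds calibration, filtering, 3D/4D connectivity and statistics, but segmentation-then-quantification is the common core.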

Relevância:

100.00%

Publicador:

Resumo:

The Amazonian region, the biggest rain forest on our planet, is known for its extraordinary biodiversity. Most of this diversity is still unexplored, and new species of different taxa are regularly found there. In this region, as in most areas of the world, insects are among the most abundant organisms, so studying this group is important for promoting the conservation of these highly biodiverse ecosystems. Among insects, parasitoid wasps are especially interesting because they have potential for use as biodiversity indicators and as biological control agents in agriculture and forestry. The parasitoid wasp family Ichneumonidae is one of the most species-rich groups in the kingdom Animalia, yet it is still poorly known in many areas of the world; the Amazonian region is a clear example of this situation. Ichneumonids have been thought to be species-poor in Amazonia and other tropical areas, but recent studies suggest that parasitoid wasps may in fact be quite abundant in Amazonia and possibly in most tropical areas of the world. The aim of my doctoral thesis is to study the species richness and taxonomy of two of the best-known ichneumonid subfamilies in the Neotropical region, Pimplinae and Rhyssinae. To do this I conducted two extensive sampling programmes in Peruvian Amazonia and examined a large number of Neotropical ichneumonids deposited in different natural history museums. According to the results of my thesis, the species richness of these parasitoids in the Amazonian region is considerably higher than previously reported. First, I further develop the taxonomy of these parasitoids by describing many new species and reporting several new faunistic records (I, II, III), focusing on two genera (Xanthopimpla and Epirhyssa) that were thought to be rather species-poor; my thesis demonstrates that these groups are actually rather species-rich in the Amazonian region. Secondly, I examine the species richness of these parasitoids in a global comparison, showing that the Neotropical region, and especially Peruvian Amazonia, is one of the most species-rich areas for pimpliform ichneumonids (V), and I demonstrate that with the data available to date no clear latitudinal gradient in species richness is visible. Thirdly, extending the macroecological knowledge of these parasitoids, I show that some previously unreported ichneumonid subfamilies are present in the Amazonian region (IV). These new insights and the results of the global comparison of ichneumonid inventories suggest that the earlier belief in low tropical diversity is most likely due to a lack of sampling effort in the region. Overall, my research increases the knowledge of Neotropical ichneumonids, highlighting the importance of Peruvian Amazonia as one of the diversity hotspots of parasitoid wasps.
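The richness figures that such inventory comparisons rest on are typically the observed species count plus an estimator correcting for unsampled species. A common one is Chao1, S_chao1 = S_obs + f1² / (2·f2), where f1 and f2 are the numbers of species seen exactly once and twice. The sample below is made up purely for illustration.

```python
# Observed richness and the Chao1 estimator for one hypothetical
# Malaise-trap sample (one species name per specimen).
from collections import Counter

specimens = ["sp_a", "sp_a", "sp_b", "sp_c", "sp_c", "sp_c",
             "sp_d", "sp_e", "sp_e", "sp_f"]

counts = Counter(specimens)
s_obs = len(counts)
f1 = sum(1 for c in counts.values() if c == 1)   # singletons
f2 = sum(1 for c in counts.values() if c == 2)   # doubletons
s_chao1 = s_obs + f1 ** 2 / (2 * f2) if f2 else s_obs
```

With many singletons relative to doubletons, Chao1 rises well above the observed count, which is one way undersampling (as in the tropics) shows up in the numbers.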

Relevância:

100.00%

Publicador:

Resumo:

The consumption of psychotropic drugs among Brazilian secondary school students was examined by comparing data from four surveys that used a questionnaire adapted from the WHO Program on Research and Reporting on the Epidemiology of Drug Dependence. Students filled out the form in their classrooms without teachers present. The target population consisted of 10- to 18-year-old students (on average, 15,000 students responded to each survey) in Brazil's ten largest state capitals: Belém, Belo Horizonte, Brasília, Curitiba, Fortaleza, Porto Alegre, Recife, Rio de Janeiro, Salvador, and São Paulo. Among the legal drugs, lifetime use (use at least once during life) of tobacco increased in seven cities (the exceptions were Brasília, Porto Alegre and Rio de Janeiro). There was also a significant increase in frequent use of alcohol (six times or more per month) in six of the cities, from an average of 9.2% in 1987 to 15.0% in 1997. With respect to illegal drugs, there was a significant increase in lifetime use of marijuana (a roughly 3-fold increase, from 2.8% in 1987 to 7.6% in 1997), and cocaine use increased 4-fold over the survey period (from 0.5% in 1987 to 2.0% in 1997). Lifetime use of cocaine increased significantly in eight capitals (the exceptions were Recife and Rio de Janeiro), although frequent cocaine use increased in only three (Belém, Fortaleza and Porto Alegre), from an average of 1.0% in 1987 to 3.6% in 1997. Lifetime use of medications such as anxiolytics and amphetamines increased on average 2-fold over the survey period. Comparing the four studies, the main conclusion is that there were significant increases in lifetime use, frequent use and heavy use of many drugs.
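The fold increases quoted above can be checked with a few lines; the prevalence pairs are the 1987 and 1997 figures from the text.

```python
# Fold change between two prevalence figures (percent).
def fold(first, last):
    return last / first

marijuana = fold(2.8, 7.6)   # quoted as a roughly 3-fold increase
cocaine = fold(0.5, 2.0)     # quoted as a 4-fold increase
alcohol = fold(9.2, 15.0)    # frequent alcohol use
```

Marijuana comes out at about 2.7×, which the abstract rounds to 3-fold; cocaine is exactly 4×.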

Relevância:

100.00%

Publicador:

Resumo:

Thanks to advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of the chip; however, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications, and with their computational power these platforms are likely to be used in various application domains, from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the resources have to be utilized efficiently in terms of performance and power consumption. The high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems; additionally, radiation, which is frequent in space but becomes an issue also at ground level, can cause transient faults. This can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore various dynamic reconfiguration mechanisms and integrate them into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach in which the developed rigorous models are transformed into a high-level implementation language; specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.
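One possible reading of such a dynamic reconfiguration mechanism, sketched informally in Python rather than in the thesis's formal refinement notation: an "agent" maps tasks onto healthy cores and migrates tasks away from a core reported faulty. All names and the mapping policy are hypothetical.

```python
# Illustrative agent-based remapping: tasks migrate off faulty cores.

class MappingAgent:
    def __init__(self, cores):
        self.healthy = set(cores)
        self.mapping = {}            # task -> core

    def map_task(self, task):
        # Greedy balance: pick the healthy core with the fewest tasks.
        core = min(self.healthy,
                   key=lambda c: sum(1 for t in self.mapping.values()
                                     if t == c))
        self.mapping[task] = core
        return core

    def report_fault(self, core):
        # Reconfigure: retire the core and migrate its tasks.
        self.healthy.discard(core)
        for task, c in list(self.mapping.items()):
            if c == core:
                self.map_task(task)

agent = MappingAgent(cores=[0, 1, 2])
for t in ["t1", "t2", "t3"]:
    agent.map_task(t)
agent.report_fault(1)
```

After the fault report, all three tasks continue executing on the remaining cores, which is the resilience-with-acceptable-performance idea in miniature.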

Relevância:

100.00%

Publicador:

Resumo:

Ricinus communis L. is of great economic importance due to the oil extracted from its seeds. Castor oil has been used for pharmaceutical and industrial applications, as a lubricant or coating agent, as a component of plastic products, as a fungicide, and in the synthesis of biodiesel fuels. After oil extraction, a castor cake with a large amount of protein is obtained. However, this by-product cannot be used as animal feed because of the presence of toxic (ricin) and allergenic (2S albumin) proteins. Here, we propose two processes for detoxification and allergen inactivation of the castor cake. In addition, we establish a biological test to detect ricin and validate these detoxification processes. In this test, Vero cells were treated with ricin, and cell death was assessed by cell counting and measurement of lactate dehydrogenase activity. The limit of detection of the Vero cell assay was 10 ng/mL at a concentration of 1.6 × 10^5 cells/well. Solid-state fermentation (SSF) and treatment with calcium compounds were used as cake detoxification processes. For SSF, Aspergillus niger was grown on the castor cake as a substrate, and the cake was analyzed after 24, 48, 72, and 96 h of SSF. Ricin was eliminated after 24 h of SSF treatment. The cake was also treated with 4 or 8% Ca(OH)2 or CaO, and both the toxicity and the allergenic properties were entirely abolished. A by-product free of toxicity and allergens was obtained.

Relevância:

100.00%

Publicador:

Resumo:

The purpose of this study was to find out how a software company can successfully expand its business to the Danish software market through a distribution channel. The study was commissioned by a Finnish software company and conducted using a qualitative research method: analyzing the external and internal business environment and interviewing Danish ICT organizations and M-Files personnel. The interviews were semi-structured and designed to collect comprehensive information on the existing ICT and software market in Denmark. The research used three analysis frameworks: a PEST analysis (market level), Porter's Five Forces analysis (industry-level competition) and a SWOT analysis (company level). Distribution channel theory serves as the basis for understanding why and what kinds of distribution channels the case company uses, and what kinds of channels companies in the target market use; channel strategy and design are integrated into the industry-level analysis. The empirical findings revealed that Denmark has a very business-friendly ICT environment: several organizations have ranked Denmark's information and communication technology as the best in the world. Denmark's ICT and software market is relatively small compared to many other countries in Europe, and it is centralized: the largest software clusters are in the largest cities (Copenhagen, Aarhus, Odense and Aalborg), and it is in these clusters that software companies are most likely to find suitable resellers. The following growing trends are clearly visible in the software market: mobile and wireless applications, outsourcing, security solutions, cloud computing, social business solutions and e-business solutions; it is important to take these trends into account when expanding software business to the Danish market. In Denmark, distribution channels vary depending on the product or service, and for many companies the natural distribution channel is a local partner or the internet. In the public sector, solutions are purchased through a public procurement process; in the private sector, the buying process is more straightforward. Danish companies buy software from reliable suppliers, which means that they usually buy directly from big software vendors or local partners. Some customers prefer to use professional consulting companies, which can strongly influence the selection of supplier and products; in this light, consulting companies can be important partners for software companies. Even though competition in ECM and DMS solutions is fierce, the Danish market offers opportunities for foreign companies. Penetrating the Danish market through a reseller channel requires advanced solutions and objective selection criteria for channel partners. Based on the findings, Danish companies are interested in advanced and efficient software solutions; interest in M-Files solutions was clearly seen, and the company has an excellent opportunity to expand its business to the Danish market through a reseller channel. Since the research explored the Danish ICT and software market as a whole, the results of the study may also offer valuable information to other software companies expanding their business to the Danish market.

Relevância:

100.00%

Publicador:

Resumo:

Sales and operations research publications have increased significantly in recent decades. The concept of sales and operations planning (S&OP) has gained increasing recognition and has been put forward as a key area within supply chain management (SCM). The development of S&OP stems from the need to determine future actions for both sales and operations, since off-shoring, outsourcing, complex supply chains and extended lead times make it challenging to respond to changes in the marketplace as they occur. The case company's order intake has grown rapidly in recent years. Along with this growth, new challenges concerning data management and information flow have arisen from the increasing customer orders. To manage these challenges, the case company has implemented an S&OP process, but the process is at an early stage and does not yet handle the increased customer orders adequately. The objective of the thesis is to explore the content of the case company's S&OP process extensively and to give recommendations; the objectives are categorized into six groups to clarify the purpose of the thesis. The qualitative research methods used are active participant observation, qualitative interviews, an enquiry, education, and a workshop. Notably, demand planning was felt to be cumbersome, and it is typically the biggest challenge in an S&OP process; the more proactive the sales forecasting can be, the longer the time horizon of operational planning becomes. An S&OP process is 60 percent change management, 30 percent process development and 10 percent technology. Change management and continuous improvement can sometimes be arduous and treated as secondary, so it is important that different people are involved in improving the process and that the process is constantly evaluated; process governance likewise plays a central role and has to be managed consciously. Generally, the S&OP process was seen as important, and all the stakeholders were committed to it, although particular sections were experienced as more important than others depending on the stakeholders' points of view. The recommendations for the objective groups are evaluated by achievable benefit and resource requirement. The urgent and easily implemented improvements should be executed first; the next steps are to develop a more coherent process structure and to refine cost awareness, after which demand planning, supply planning and reporting should be developed more profoundly. Finally, an information technology system should be implemented to support the process phases.
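A baseline demand-planning calculation of the kind an S&OP demand review might start from is a simple moving-average forecast; the monthly figures below are invented for illustration.

```python
# Naive demand forecast: average of the last `window` months.
def moving_average_forecast(history, window=3):
    if len(history) < window:
        raise ValueError("not enough history")
    return sum(history[-window:]) / window

orders = [100, 120, 110, 130, 150, 140]   # hypothetical monthly order intake
next_month = moving_average_forecast(orders)
```

In practice the demand plan would layer market intelligence and sales input on top of such a statistical baseline, which is exactly the proactive forecasting the abstract calls for.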

Relevância:

100.00%

Publicador:

Resumo:

The importance of software testing has grown as software products affect ever more of our everyday lives, so the connection between companies' investments and quality assurance is evident. Organizations are investing increasingly in non-functional testing, such as security, performance and usability testing. The purpose of this thesis is to examine the current state of software testing in Finland, in order to renew and improve the software testing course offering at the University of Turku to best meet the needs of companies. The thesis was carried out as a replication study. The main part of the survey contains questions about software testing methods and tools used during the activities of the testing process, complemented by general questions about the companies and their software testing environments; the survey also covers the testing levels and types the companies use and the challenges they encounter in testing. The thesis is grounded in testing process standards. Software testing standards play a central role in this work, even though they have recently been the target of strong criticism, and doubts about their necessity have arisen from changes in software development. The thesis presents results on software testing practices, compared with the results of an earlier survey on the topic (Lee, Kang, & Lee, 2011). Lack of time is found to be a major challenge in software testing. Agile software development has gained popularity in all of the respondents' companies. Testing methods and tools for test estimation, planning and reporting are used very little, whereas the use of methods and tools for automated test execution and defect management has increased. System, acceptance, unit and integration testing are in use in all of the respondents' companies, and all respondents consider regression, exploratory and non-functional testing important techniques.
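As a minimal illustration of the automated test execution the respondents report using increasingly, here is a plain `unittest` regression test; the function under test is hypothetical.

```python
# A tiny automated regression test with the standard unittest module.
import unittest

def vat_price(net, rate=0.24):
    """Gross price with a Finnish-style VAT rate (illustrative)."""
    return round(net * (1 + rate), 2)

class VatPriceRegressionTest(unittest.TestCase):
    def test_default_rate(self):
        self.assertEqual(vat_price(100), 124.0)

    def test_zero_net(self):
        self.assertEqual(vat_price(0), 0)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(VatPriceRegressionTest))
```

Run on every change, such tests are what makes the regression testing the respondents value feasible in practice.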

Relevância:

100.00%

Publicador:

Resumo:

There are no precedents concerning the quality of Octopus maya during chilled storage. This study evaluated the shelf life of the red octopus in chilled storage (4 °C) and the correlation of the sensory quality index with microbiological counts and biochemical indicators (hypoxanthine, histamine and volatile amines). A total of 112 whole raw octopuses (average weight 896 g) were randomly selected from seven batches and held at 4 °C for 18, 24, 48, 72, 84, 96, and 100 h. Histamine concentration (91.7%), followed by psychrotrophic bacteria counts (5.5%) and hypoxanthine (2.2%), were the predictors from the redundancy analysis that best explained the changes taking place during the chilling hours. After 72 h of chilling, the microbial count was log 4.7 CFU/g, and the octopus samples were classified as B quality (minor sensory quality defects) on the sensory quality scale. Although the samples were not classified as unacceptable at 100 h of refrigeration by the sensory index, the histamine level reached the defect action level (5 mg/100 g) set by the international food safety authorities. The shelf life of the red octopus in chilled storage was predicted to be 119 h.
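The arithmetic behind such a shelf-life prediction can be sketched with a linear fit: regress histamine on chilling time and solve for when the 5 mg/100 g defect action level is reached. The readings below are invented, and the study itself used redundancy analysis rather than this simple regression.

```python
# Least-squares line through (made-up) histamine measurements, then
# solve slope * t + intercept = 5.0 for the time t.

hours = [24, 48, 72, 96]
histamine = [1.0, 2.0, 3.0, 4.0]        # mg/100 g, hypothetical

n = len(hours)
mx = sum(hours) / n
my = sum(histamine) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(hours, histamine))
         / sum((x - mx) ** 2 for x in hours))
intercept = my - slope * mx
limit_hours = (5.0 - intercept) / slope   # time to reach 5 mg/100 g
```

With these invented numbers the limit falls at 120 h, the same order as the 119 h shelf life reported above, which is only a coincidence of the chosen data.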

Relevância:

100.00%

Publicador:

Resumo:

Many-core systems offer great potential for application performance with their massively parallel structure. Such systems are currently being integrated into most parts of daily life, from high-end server farms to desktop systems, laptops and mobile devices. Yet these systems face increasing challenges: high temperatures causing physical damage, high electricity bills both for servers and for individual users, unpleasant noise levels due to active cooling, and unrealistic battery drainage in mobile devices, all caused directly by poor energy efficiency. Power management has traditionally been an area of research providing hardware solutions or runtime power management in the operating system in the form of frequency governors. Energy awareness in application software is currently non-existent: applications are not involved in power management decisions, nor does any interface exist between the applications and the runtime system to provide such facilities. Power management in the operating system is therefore performed purely on the basis of indirect implications of software execution, usually referred to as the workload, which often results in over-allocation of resources and hence wasted power. This thesis discusses power management strategies for many-core systems in the form of increased application software awareness of energy efficiency. The presented approach allows meta-data descriptions in the applications and is manifested in two design recommendations, 1) energy-aware mapping and 2) energy-aware execution, which allow applications to directly influence power management decisions. The recommendations eliminate over-allocation of resources and increase the energy efficiency of the computing system, and both are fully supported by a provided interface in combination with a novel power management runtime system called Bricktop. The work presented in this thesis allows both new and legacy software to execute with the most energy-efficient mapping on a many-core CPU and at the most energy-efficient performance level. A set of case study examples demonstrates real-world energy savings in a wide range of applications without performance degradation.
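The energy-aware execution idea can be sketched as choosing the cheapest operating point that still meets an application-supplied deadline, which is the kind of decision the application's meta-data enables. The operating points and the function below are illustrative and are not Bricktop's actual interface.

```python
# Illustrative energy-aware level selection: among the operating
# points that meet the deadline, pick the one with the least energy.

# (frequency GHz, power W) operating points, hypothetical.
levels = [(0.5, 1.0), (1.0, 2.5), (2.0, 8.0)]

def pick_level(cycles, deadline_s):
    """Return ((freq, power), energy) of the best feasible level, or None."""
    best = None
    for freq, power in levels:
        runtime = cycles / (freq * 1e9)
        if runtime <= deadline_s:
            energy = power * runtime        # E = P * t
            if best is None or energy < best[1]:
                best = ((freq, power), energy)
    return best

choice = pick_level(cycles=1e9, deadline_s=1.5)
```

Without the deadline hint, a workload-driven governor might run at the highest frequency and waste the difference in energy; with it, the runtime can settle on the efficient middle point.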

Relevância:

100.00%

Publicador:

Resumo:

In this study, finite element analyses and experimental tests were carried out to investigate the effect of loading type and symmetry on the fatigue strength of three different non-load-carrying welded joints. Current codes and recommendations do not give explicit instructions on how to account for the degree of bending in the loading, or for the effect of symmetry, in the fatigue assessment of welded joints. The fatigue assessment is done using the effective notch stress method and linear elastic fracture mechanics. The transverse attachment and cover plate joints are analyzed with 2D plane-strain element models in FEMAP/NxNastran and Franc2D, and the longitudinal gusset case is analyzed with solid element models in Abaqus and Abaqus/XFEM. From the evaluated effective notch stress range and stress intensity factor range, the nominal fatigue strength is assessed. The experimental tests consist of fatigue tests of transverse attachment joints, with a total of 12 specimens, in which the effect of both loading type and symmetry on the fatigue strength is studied. The finite element analyses showed that, in terms of the nominal and hot-spot stress methods, the fatigue strength of the asymmetric joint is higher under tensile loading and the fatigue strength of the symmetric joint is higher under bending loading. Linear elastic fracture mechanics indicated that bending reduces the stress intensity factors when the crack is relatively large, since the normal stress at the crack tip decreases due to the stress gradient. Under tensile loading, the experimental tests agreed with the finite element analyses. However, the fatigue-tested joints subjected to bending showed that bending increased the fatigue strength of the non-load-carrying welded joints, so the fatigue test results did not fully agree with the fatigue assessment. According to the results, it can be concluded that under tensile loading the symmetry of the joint distinctly affects the fatigue strength. The fatigue life assessment of joints loaded in bending is challenging, since it depends on whether crack initiation or propagation is predominant.
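The crack-propagation side of such an assessment is commonly estimated by integrating Paris' law, da/dN = C (ΔK)^m with ΔK = Y Δσ √(πa), from an initial to a final crack size. The sketch below uses illustrative constants and a constant geometry factor, not the thesis's values.

```python
# Midpoint-rule integration of Paris' law to estimate fatigue life.
import math

C, m = 3e-13, 3.0          # Paris constants (illustrative; MPa, mm units)
Y = 1.12                   # geometry factor, assumed constant
ds = 100.0                 # nominal stress range, MPa
a0, af = 0.2, 10.0         # initial / final crack size, mm

def paris_life(a0, af, steps=10000):
    """Sum dN = da / (C * dK^m) over small crack increments."""
    da = (af - a0) / steps
    n_cycles = 0.0
    for i in range(steps):
        a_mid = a0 + (i + 0.5) * da                 # midpoint of increment
        dk = Y * ds * math.sqrt(math.pi * a_mid)    # stress intensity range
        n_cycles += da / (C * dk ** m)
    return n_cycles

life = paris_life(a0, af)
```

Because ΔK grows with √a, most of the life is spent while the crack is small; this is also why a bending-induced stress gradient, which lowers the stress at a deep crack tip, mainly affects the late (large-crack) part of the propagation.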