865 results for New Technologies
Abstract:
After more than twenty years of basic and applied research, the use of nanotechnology in the design and manufacture of nanoscale materials is rapidly increasing, particularly in commercial applications spanning electronics, renewable energy, and biomedical devices. Novel polymers are attracting significant attention because they promise a low-cost, high-performance alternative to existing materials. Furthermore, these polymers have the potential to overcome limitations imposed by currently available materials, thus enabling the development of new technologies and applications that are currently beyond our reach. This work focuses on the development of a range of new low-cost, environmentally friendly polymer materials for applications in organic (flexible) electronics, optics, and biomaterials. The choice of monomer reflects the environmentally conscious focus of this project. Terpinen-4-ol is a major constituent of Australian-grown Melaleuca alternifolia (tea tree) oil, to which the oil's antimicrobial and anti-inflammatory properties are attributed. Plasma polymerisation was chosen as the deposition technique because it requires minimal use of harmful chemicals and produces no hazardous by-products. Polymer thin films were fabricated under varied process conditions to attain materials with distinct physico-chemical, optoelectrical, biological and degradation characteristics. The resultant materials, named polyterpenol, were extensively characterised using a number of well-accepted and novel techniques, and their fundamental properties were defined. Polyterpenol films were demonstrated to be hydrocarbon rich, with variable content of oxygen moieties, primarily in the form of hydroxyl and carboxyl functionalities. The level of preservation of the original monomer functionality was shown to depend strongly on the deposition energy, with higher applied power increasing molecular fragmentation and substrate temperature.
Polyterpenol water contact angle increased from 62.7° for the 10 W samples to 76.3° for films deposited at 100 W. The polymers were determined to resist solubilisation by water, owing to the extensive intermolecular and intramolecular hydrogen bonds present, and by other solvents commonly employed in electronics and biomedical processing. Independent of deposition power, the surface topography of the polymers was shown to be smooth (Rq < 0.5 nm), uniform and defect free. Hardness of the polyterpenol coatings increased from 0.33 GPa at 10 W to 0.51 GPa at 100 W (at 500 μN load). Coatings deposited at higher input RF powers showed less mechanical deformation during nanoscratch testing, with no considerable damage, cracking or delamination observed. Independent of the substrate, film adhesion improved with RF power, suggesting these coatings are likely to be more stable and less susceptible to wear. Independent of fabrication conditions, polyterpenol thin films were optically transparent, with a refractive index approximating that of glass. The refractive index increased slightly with deposition power, from 1.54 (10 W) to 1.56 (100 W) at 500 nm. The optical band gap declined with increasing power, from 2.95 eV to 2.64 eV, placing the material within the semiconductor range. Introduction of an iodine impurity reduced the band gap of polyterpenol from 2.8 eV to 1.64 eV by extending the density of states further into the visible region of the electromagnetic spectrum. Doping decreased the transparency and increased the refractive index from 1.54 to 1.70 (at 500 nm). At optical frequencies, the real part of the permittivity (k) was determined to be between 2.34 and 2.65, indicating a potential low-k material. These permittivity values were confirmed at microwave frequencies, where permittivity increased with input RF energy, from 2.32 to 2.53 (at 10 GHz) and from 2.65 to 2.83 (at 20 GHz).
At low frequencies, the dielectric constant was determined from current-voltage characteristics of Al-polyterpenol-Al devices. At frequencies below 100 kHz, the dielectric constant varied with RF power, from 3.86 to 4.42 at 1 kHz. For all samples, the resistivity was of the order of 10⁸-10⁹ Ω m (at 6 V), confirming the insulating nature of the polyterpenol material. In situ iodine doping was demonstrated to increase the conductivity of polyterpenol from 5.05 × 10⁻⁸ S/cm to 1.20 × 10⁻⁶ S/cm (at 20 V). Exposed to ambient conditions over an extended period of time, polyterpenol thin films were demonstrated to be optically, physically and chemically stable. The bulk of ageing occurred within the first 150 h after deposition and was attributed to oxidation and volumetric relaxation. Thermal ageing studies indicated that thermal stability increased for films manufactured at higher RF powers, with the degradation onset temperature associated with weight loss shifting from 150 °C for 10 W to 205 °C for 100 W polyterpenol. Annealing the films to 405 °C resulted in full dissociation of the polymer, with minimal residue. Given the outcomes of the fundamental characterisation, a number of potential applications for polyterpenol have been identified. The flexibility, tunable permittivity and loss tangent properties of polyterpenol suggest the material can be used as an insulating layer in plastic electronics. Implementation of polyterpenol as a surface modification of the gate insulator in a pentacene-based field-effect transistor resulted in significant improvements, shifting the threshold voltage from +20 V to -3 V, enhancing the effective mobility from 0.012 to 0.021 cm²/Vs, and improving the switching property of the device from 10⁷ to 10⁴. Polyterpenol was demonstrated to have a hole-transporting, electron-blocking property, with potential applications in many organic devices, such as organic light-emitting diodes.
Encapsulation of biomedical devices is also proposed, given that, under favourable conditions, the original chemical and biological functionality of the terpinen-4-ol molecule can be preserved. Films deposited at low RF power were shown to successfully prevent adhesion and retention of several important human pathogens, including P. aeruginosa, S. aureus, and S. epidermidis, whereas films deposited at higher RF power promoted bacterial cell adhesion and biofilm formation. Preliminary investigations into the in vitro biocompatibility of polyterpenol demonstrated the coating to be non-toxic to several types of eukaryotic cells, including Balb/c mouse macrophages and human monocytes (non-adherent THP-1 cells). Applied to magnesium substrates, the polyterpenol encapsulating layer significantly slowed the in vitro biodegradation of the metal, thus increasing the viability and growth of THP-1 cells. Recently, applied to varied nanostructured titanium surfaces, polyterpenol thin films successfully reduced the attachment, growth, and viability of P. aeruginosa and S. aureus.
Abstract:
The availability and quality of irrigation water has become an issue limiting productivity in many Australian vegetable regions. Production is also under competitive pressure from supply chain forces. Producers look to new technologies, including changed irrigation infrastructure, new water sources, and more complex irrigation management, to survive these stresses. Often there is little objective information on which technologies could improve outcomes for vegetable producers and external communities (e.g. meeting NRM targets). This has led to investment in inappropriate technologies, and costly repetition of errors, as businesses independently discover the worth of technologies through personal experience. In our project, we investigated technology improvements for vegetable irrigation. Through engagement with industry and other researchers, we identified the technologies most applicable to growers, particularly those that addressed priority issues. We developed analytical tools for ‘what if’ scenario testing of technologies. We conducted nine detailed experiments in the Lockyer Valley and Riverina vegetable growing districts, as well as case studies on grower properties in southern Queensland. We investigated root zone monitoring tools (FullStop™ wetting front detectors and Soil Solution Extraction Tubes, SSET), drip system layout, fertigation equipment, and altered planting arrangements. Our project team developed and validated models for broccoli, sweet corn, green beans and lettuce, and spreadsheets for evaluating the economic risks associated with new technologies. We presented project outcomes at over 100 extension events, including irrigation showcases, conferences, field days, farm walks and workshops. The FullStops™ were excellent for monitoring root zone conditions (EC, nitrate levels) and for managing irrigation with poor quality water. They were easier to interpret than the SSET.
The SSET were simpler to install, but required wet soil to be reliable. SSET were an option for monitoring deeper soil zones unsuitable for FullStop™ installations. Because these root zone tools require expertise and are labour intensive, we recommend they be used to address specific problems, or as a periodic auditing strategy, not for routine monitoring. In our research, we routinely found high residual N in horticultural soils, with subsequently little crop yield response to additional nitrogen fertiliser. With improved irrigation efficiency (and less leaching), it may be timely to re-examine nitrogen budgets and recommendations for vegetable crops. Where the drip irrigation tube was located close to the crop row (i.e. within 5-8 cm), irrigation management was easier. It improved nitrogen uptake and water use efficiency, and reduced the risk of poor crop performance through moisture stress, particularly in the early crop establishment phases. Close proximity of the drip tube to the crop row gives the producer more options for managing salty water, and more flexibility in taking risks with forecast rain. In many vegetable crops, proximate drip systems may not be cost-effective; the next best alternative is to push crop rows closer to the drip tube (leading to an asymmetric row structure). The vegetable crop models are good at predicting crop phenology (development stages, time to harvest), input use (water, fertiliser), environmental impacts (nutrient and salt movement) and total yields. The two immediate applications for the models are understanding, predicting and manipulating harvest dates and nitrogen movements in vegetable cropping systems. From the economic tools, the major influences on accumulated profit are price and yield. In doing ‘what if’ analyses, it is very important to be as accurate as possible in ascertaining the assumed yield and price ranges. In most vegetable production systems, lowering the required inputs (e.g.
irrigation requirement, fertiliser requirement) is unlikely to have a major influence on accumulated profit. However, if a resource is constraining (e.g. available irrigation water), it is usually most profitable to maximise return per unit of that resource.
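The closing point above, that when a resource such as irrigation water is constraining it is usually most profitable to maximise return per unit of that resource, can be sketched in a few lines. All crop figures below are hypothetical illustrations, not values from the project's spreadsheets.

```python
# Sketch of constrained-resource ranking: when water is the limiting input,
# rank options by gross margin per megalitre, not by margin per hectare.
# All numbers are invented for illustration only.

crops = {
    # name: (gross_margin_per_ha_$, water_ML_per_ha)
    "lettuce":    (9000.0, 4.0),
    "sweet_corn": (5500.0, 5.5),
    "broccoli":   (7000.0, 3.5),
}

def margin_per_ML(crop):
    """Gross margin earned per megalitre of irrigation water applied."""
    margin, water = crops[crop]
    return margin / water

def rank_by_water_return(crops_dict):
    """Order crops from highest to lowest return per ML of water."""
    return sorted(crops_dict,
                  key=lambda c: crops_dict[c][0] / crops_dict[c][1],
                  reverse=True)

ranking = rank_by_water_return(crops)
# With these figures, lettuce offers the highest return per ML, so a
# water-constrained grower would allocate scarce water to it first, even
# though other crops may use less water per hectare in absolute terms.
```

The same comparison reversed (margin per hectare) would apply when land, not water, is the binding constraint, which is why the abstract stresses identifying the constraining resource first.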
Abstract:
The development of biotechnology techniques in plant breeding and their new commercial applications have raised public and scientific concerns about the safety of genetically modified (GM) crops and trees. To assess the feasibility of these new technologies in the breeding of commercially important Finnish hardwood species, and to estimate the ecological risks of the resulting transgenic plants, the experiments of this study were conducted as part of a larger project focusing on the risk assessment of GM trees. Transgenic Betula pendula and Populus trees were produced via Agrobacterium-mediated transformation. A stilbene synthase (STS) gene from pine (Pinus sylvestris) and a chitinase gene from sugar beet (Beta vulgaris) were transferred to (hybrid) aspen and birch, respectively, to improve disease resistance against fungal pathogens. To modify lignin biosynthesis, a 4-coumarate:coenzyme A ligase (4CL) gene fragment in antisense orientation was introduced into two birch clones. In an in vitro test, one transgenic aspen line expressing the pine STS gene showed increased resistance to the decay fungus Phellinus tremulae. In the field, chitinase transgenic birch lines were more susceptible to leaf spot (Pyrenopeziza betulicola) than the non-transgenic control clone, while resistance against birch rust (Melampsoridium betulinum) was improved. No changes in the content or composition of lignin were detected in the 4CL antisense birch lines. To evaluate the ecological effects of the produced GM trees on non-target organisms, an in vitro mycorrhiza experiment with Paxillus involutus and a decomposition experiment in the field were performed. The expression of the transgenic chitinase did not disturb the establishment of mycorrhizal symbiosis between birch and P. involutus in vitro. The 4CL antisense birch lines showed retarded root growth but were able to form normal ectomycorrhizal associations with the mycorrhizal fungus in vitro.
The 4CL lines also showed normal litter decomposition. Unexpected growth reductions resulting from the gene transformation were observed in the chitinase transgenic and 4CL antisense birch lines. These results indicate that genetic engineering can provide a tool for increasing disease resistance in Finnish tree species. More extensive data with several ectomycorrhizal species are needed to evaluate the consequences of transgene expression on beneficial plant-fungus symbioses. The potential pleiotropic effects of the transgene should also be taken into account when considering the safety of transgenic trees.
Abstract:
The aim of this thesis was to develop measurement techniques and systems for measuring air quality and to provide information about air quality conditions and the amount of gaseous emissions from semi-insulated and uninsulated dairy buildings in Finland and Estonia. Specialization and intensification in livestock farming, such as dairy production, is usually accompanied by an increase in concentrated environmental emissions. In addition to high moisture, the presence of dust and corrosive gases, and widely varying gas concentrations in dairy buildings, Finland and Estonia experience winter temperatures below -40 °C and summer temperatures above +30 °C. The adoption of new technologies for long-term air quality monitoring and measurement remains relatively uncommon in dairy buildings because the construction and maintenance of accurate monitoring systems for long-term use are too expensive for the average dairy farmer to afford. Though accurate air quality measurement systems intended mainly for research purposes have been documented in the past, standardised methods and documentation of affordable systems and simple methods for performing air quality and emissions measurements in dairy buildings are unavailable. In this study, we built three measurement systems: 1) a Stationary system with integrated affordable sensors for on-site measurements, 2) a Wireless system with affordable sensors for off-site measurements, and 3) a Mobile system consisting of expensive and accurate sensors for measuring air quality. In addition to assessing existing methods, we developed simplified methods for measuring ventilation and emission rates in dairy buildings. The three measurement systems were successfully used to measure air quality in uninsulated, semi-insulated, and fully-insulated dairy buildings between 2005 and 2007. When carefully calibrated, the affordable sensors in the systems gave reasonably accurate readings.
The spatial air quality survey showed high variation in microclimate conditions among the dairy buildings measured. The average indoor concentrations were 950 ppm for carbon dioxide, 5 ppm for ammonia and 48 ppm for methane; the average relative humidity was 70% and the average inside air velocity 0.2 m/s. The average winter and summer indoor temperatures during the measurement period were -7 °C and +24 °C for the uninsulated, +3 °C and +20 °C for the semi-insulated, and +10 °C and +25 °C for the fully-insulated dairy buildings. The measurement results showed that the uninsulated dairy buildings had lower indoor gas concentrations and emissions than the fully insulated buildings. Although occasionally exceeded, the ventilation rates and average indoor air quality in the dairy buildings were largely within recommended limits. We assessed the traditional heat balance, moisture balance, carbon dioxide balance and direct airflow methods for estimating ventilation rates. Direct velocity measurement for the estimation of ventilation rate proved impractical for naturally ventilated buildings. Two methods were developed for estimating ventilation rates. The first method is applicable in buildings in which the ventilation can be stopped or completely closed. The second method is useful in naturally ventilated buildings with large openings and high ventilation rates, where spatial gas concentrations are heterogeneously distributed. Two traditional methods (carbon dioxide and methane balances) and two newly developed methods (theoretical modelling using Fick's law and boundary layer theory, and the recirculation flux-chamber technique) were used to estimate ammonia emissions from the dairy buildings. Using the traditional carbon dioxide balance method, ammonia emissions per cow ranged from 7 g day⁻¹ to 35 g day⁻¹, and methane emissions per cow ranged from 96 g day⁻¹ to 348 g day⁻¹.
The developed methods proved to be as accurate as the traditional methods: variation between the mean emissions estimated with the traditional and the developed methods was less than 20%. The developed modelling procedure provided a sound framework for examining the impact of production systems on ammonia emissions in dairy buildings.
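The carbon dioxide balance method referred to above can be sketched in outline: the airflow through the building is inferred from the herd's known CO2 production and the indoor/outdoor concentration difference, and a gas emission rate then follows from that same airflow. The per-cow CO2 production and the ammonia concentration used below are assumed illustrative values, not measurements from the study.

```python
# Sketch of the CO2 balance method for a naturally ventilated barn.
# Assumption: at steady state, ventilation air removes exactly the CO2
# the animals produce, so airflow = production / concentration difference.

def ventilation_rate_m3_h(co2_production_m3_h, c_in_ppm, c_out_ppm):
    """Airflow (m3/h) that dilutes the produced CO2 to the observed level."""
    return co2_production_m3_h / ((c_in_ppm - c_out_ppm) * 1e-6)

def emission_rate_g_h(ventilation_m3_h, c_in_mg_m3, c_out_mg_m3):
    """Gas emission (g/h) carried out by the ventilation airflow."""
    return ventilation_m3_h * (c_in_mg_m3 - c_out_mg_m3) / 1000.0

# Assumed example: one cow producing 0.20 m3 CO2/h, indoor CO2 of 950 ppm
# against 400 ppm outdoors (950 ppm matches the survey average), and an
# indoor ammonia level of 3.5 mg/m3 with negligible ammonia outdoors.
q = ventilation_rate_m3_h(0.20, 950, 400)        # roughly 360 m3/h per cow
nh3_daily = emission_rate_g_h(q, 3.5, 0.0) * 24  # g NH3 per cow per day
```

With these assumed inputs the estimate lands inside the 7-35 g day⁻¹ per-cow range the abstract reports, which is the kind of consistency check the balance methods allow without any direct airflow measurement.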
Abstract:
We review here research on semiochemicals for cotton pest management carried out in successive Cotton Co-operative Research Centres from 1998 to 2012. Australian cotton is now dominated by transgenic (Bt) varieties, which provide a strong platform for integrated pest management of key pests such as Helicoverpa spp., but new technologies are required to manage the development of resistance in Helicoverpa spp. to transgenic cotton and the problems posed by emerging and secondary pests, especially sucking insects. A long-range attractant for Helicoverpa moths, based on plant volatiles, has been commercialised as Magnet®. The product has substantial area-wide impacts on moth populations, and only limited effects on beneficial insects. Potential roles are being investigated for this product in resistance management of Helicoverpa spp. on transgenic cotton. Short-range, non-volatile compounds on organ surfaces of plants that do not support development of Helicoverpa spp. have been identified; these compounds deter feeding or oviposition, or are toxic to insect pests. One such product, Sero X®, is effective on Helicoverpa spp. and sucking pests such as whiteflies (Bemisia tabaci), green mirids (Creontiades dilutus), and other hemipteran insects, and is in the advanced stages of commercialisation.
Abstract:
This study investigates the relationship between per capita carbon dioxide (CO2) emissions and per capita GDP in Australia, while controlling for the technological state as measured by multifactor productivity and exports of black coal. Although technological progress seems to play a critical role in achieving the long-term goals of CO2 reduction and economic growth, empirical studies have often used a time trend to proxy technological change. However, as the discovery and diffusion of new technologies may not progress smoothly with time, the assumption of deterministic technological progress may be incorrect in the long run. The use of multifactor productivity as a measure of technological state therefore overcomes these limitations and provides practical policy directions. This study uses the recently developed bounds-testing approach, complemented by the Johansen-Juselius maximum likelihood approach and a reasonably large sample size, to investigate the cointegration relationship. Both techniques suggest that a cointegration relationship exists among the variables. The long-run and short-run coefficients of the CO2 emissions function are estimated using the ARDL approach. The empirical findings show evidence of an Environmental Kuznets Curve type relationship for per capita CO2 emissions in the Australian context. Technology as measured by multifactor productivity, however, is not found to be an influencing variable in the emissions-income trajectory.
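The Environmental Kuznets Curve relationship the study tests is an inverted U: emissions first rise with income, then fall. In a regression, this shows up as a positive coefficient on income and a negative coefficient on income squared. The sketch below uses synthetic data, and plain OLS stands in for the study's ARDL/bounds-testing machinery, so it illustrates the shape test only.

```python
# Toy illustration of an EKC sign test: fit emissions on income and
# income squared, then check b1 > 0 and b2 < 0 (inverted U) and locate
# the turning point. The data are synthetic, not the Australian series.
import numpy as np

rng = np.random.default_rng(0)
income = np.linspace(1.0, 10.0, 60)                       # per capita GDP (index)
emissions = 3.0 * income - 0.2 * income**2 + rng.normal(0, 0.3, 60)

# Ordinary least squares for emissions = b0 + b1*income + b2*income^2
X = np.column_stack([np.ones_like(income), income, income**2])
b0, b1, b2 = np.linalg.lstsq(X, emissions, rcond=None)[0]

# Income level at which emissions peak (vertex of the parabola).
turning_point = -b1 / (2.0 * b2)
```

An EKC finding in the actual study corresponds to recovering this sign pattern from the cointegrating relationship; the turning point is the income level past which further growth coincides with falling per capita emissions.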
Abstract:
This study explores the EMU stands taken by the major Finnish political parties from 1994 to 1999. The starting point is empirical evidence showing that party responses to European integration are shaped by a mix of national and cross-national factors, with national factors having more explanatory value. The study is the first to produce evidence that classified party documents such as protocols, manifestos and authoritative policy summaries can describe a party's EMU policy emphasis. In fact, as the literature review demonstrates, it has so far been unclear what stand the three major Finnish political parties took during 1994–1999. Consequently, this study makes a substantive contribution to understanding the factors that shaped EMU party policies, and eventually the national EMU policy, during the 1990s. The research questions addressed are the following: What are the main factors that shaped partisan standpoints on EMU during 1994–1999? To what extent did the policy debate and themes change within the political parties? How far were the policies of the Social Democratic Party, the Centre Party and the National Coalition Party shaped by factors unique to their own national contexts? Furthermore, to what extent were they determined by cross-national influences from abroad, especially from countries with which Finland has a special relationship, such as Sweden? The theoretical background of the study lies in the areas of party politics, approaches to EU policies, and party change, developed mainly by Kevin Featherstone, Peter Mair and Richard Katz. At the same time, the study puts forward generic hypotheses that help to explain party standpoints on EMU. It incorporates a large quantity of new classified material examined through content analysis and interviews. Quantitative and qualitative methods are used sequentially in order to overcome their respective limitations. Established content-analysis techniques improve the reliability of the data.
The coding frame is based on the salience theory of party competition. Interviews with eight party leaders and one independent expert civil servant provided additional insights and improve the validity of the data. Public-opinion surveys and media coverage are also used to complete the research path. Four major conclusions are drawn from the research findings. First, the quantitative and interview data reveal the importance of the internal influences within the parties that most noticeably shaped their EMU policies during the 1990s; international events, by contrast, played a minor role. The most striking feature turned out to be the strong emphasis by all of the parties on economic goals, although the weight given to economic, democratic and international issues differed across the three major parties. Secondly, it seems that the parties have transformed into centralised and professional organisations in terms of their EMU policy-making: the weight and direction of party EMU strategy rests with the leadership and a few administrative elites. This could imply changes in their institutional environment; eventually, parties may appear generally less differentiated and more standardised in their policy-making. Thirdly, the case of the Social Democratic Party shows that traditional organisational links continue to exist between the left and the trade unions in terms of EMU policy-making. Hence, it could be that the parties have not yet moved beyond their conventional affiliate organisations. Fourthly, parties tend to neglect citizen opinion and demands with regard to EMU, which could imply a conflict with the changes in their strategic environment. They seem to give more attention to the demands of political competition (party-party relationships) than to public attitudes (party-voter relationships), which would imply that they have had to learn to be more flexible and responsive.
Finally, three suggestions for institutional reform are offered, which could contribute to the emergence of legitimised policy-making: measures to bring more party members and voter groups into the policy-making process; measures to adopt new technologies in order to open up the policy-formation process in the early phase; and measures to involve all interest groups in the policy-making process.
Abstract:
The emergence of new technologies has revolutionized the way companies interact and build relationships with customers. The channel–customer relationship has traditionally been managed via a push approach in communication (“What can we sell customers?”) with the hope of cultivating customer loyalty. However, emotional understandings of customers and how they feel about a product, service, or business can drastically alter consumers’ engagement, behavior, and purchasing preferences. This rapidly evolving landscape has left managers at a loss, and what they are experiencing is likely the beginning of a tectonic shift in the way digital channels are designed, monitored, and managed. In this article, digital channel relationships are examined, and useful concepts for clarifying and refining the emotional meaning behind company strategy and their relationship to corresponding digital channels are detailed. Using three case study examples, we discuss the process and impact of such emotionally aware digital channel designs. Recommendations are made regarding how companies can select, design, and maintain digital engagements based on their strategy and industry needs.
Abstract:
"New global contexts are presenting new challenges and new possibilities for young children and those around them. Climate change, armed conflict and poverty combine with new frontiers of discovery in science and technology to create a paradoxical picture of both threat and opportunity for our world and our children. On the one hand, children are experiencing unprecedented patterns of disparity and inequity; yet, on the other hand, they have seemingly limitless possibilities to engage with new technologies and social processes. Seismic shifts such as these are inviting new questions about the conditions that young children need to learn and thrive. Diversity in the Early Years: Intercultural Learning and Teaching explores significant aspects of working with children and adults from diverse backgrounds. It is a valuable resource for teaching early childhood pre-service teachers to raise awareness about issues of diversity - whether diversity of culture, language, education and/or gender - and for helping them to develop their own pedagogical approaches to working with diverse populations."--Publisher website
Abstract:
Architecture focuses on designing built environments in response to society’s needs, reflecting culture through materials and forms. The physical boundaries of the city have become blurred through the integration of digital media, connecting the physical environment with the digital. In the recent past, the future was imagined as highly technological: Ridley Scott’s 1982 Blade Runner is set in 2019 and introduces a world where supersized screens inject advertisements into a cluttered urban space. Now, in 2015, screens are central to everyday life, but in a completely different way from what had been imagined. Through ubiquitous computing and social media, information is abundant. Digital technologies have changed the way people relate to cities, supporting discussion on multiple levels and allowing citizens to be more vocal than ever before. We question how architects can use the affordances of urban informatics to obtain and navigate useful social information to inform design. This chapter investigates different approaches to engaging communities in the debate on cities; in particular, it aims to capture citizens’ opinions on the use and design of public places, and both physical and digital discussions have been initiated to that end. In addition to traditional consultation methods, Web 2.0 platforms, urban screens, and mobile apps are used in the context of Brisbane, Australia to explore contemporary strategies of engagement (Gray 2014).
Abstract:
This study investigates primary and secondary school teachers’ social representations and ways of conceptualising new technologies. The focus is both on teachers’ descriptions, interpretations and conceptions of technology and on the adoption and formation of these conceptions. In addition, the purpose of this study is to analyse how the national objectives of the information society and the implementation of information and communication technologies (ICT) in schools are reflected in teachers’ thinking and everyday practices. The starting point for the study is the idea of a dynamic and mutual relationship between teachers and technology, such that technology does not one-sidedly affect teachers’ thinking. This relationship is described in this study as the teachers’ technology relationship. The concept emphasises that technology cannot be separated from society, social relations and the context where it is used; it is intertwined with societal practices and is therefore formed in interaction with material and social factors. The theoretical part of this study encompasses three different research traditions: 1) the social shaping of technology, 2) research on how schools and teachers use technology, and 3) social representations theory. The study was part of the Helmi Project (Holistic development of e-Learning and business models), carried out in 2001–2005 at the Helsinki University of Technology, SimLab research unit. The Helmi Project focused on different aspects of the utilisation of ICT in teaching. The research data consisted of interviews with teachers and principals. Altogether 37 interviews were conducted in 2003 and 2004 in six different primary and secondary schools in Espoo, Finland. The data was analysed applying grounded theory. The results showed that the teachers’ technology relationship was diverse and context specific.
Technology was interpreted differently depending on the context: the teachers’ technology-related descriptions and metaphors illustrated, on the one hand, the benefits and possibilities and, on the other, the problems and threats of different technologies. The dual nature of technology was also expressed in teachers’ thinking about technology as a deterministic and irrevocable force and, at the same time, as a controllable and functional tool. Teachers did not consider technology to have a stable character; rather, they interpreted it in relation to the variable context of use. In this way they positioned, or anchored, technology in their everyday practices. The study also analysed the formation of the teachers’ technology relationship and the ways teachers familiarise themselves with new technologies. Comparison of different technologies, as well as technology-related metaphors, turned out to be significant in forming the technology relationship. The ways teachers described the familiarisation process and interpreted their own technical skills also affected the formation of the technology relationship. In addition, teachers defined technology together with other teachers, and these discussions reflected teachers’ interpretations and descriptions.
Resumo:
ABSTRACT Idiopathic developmental disorders (DDs) affect ~1% of the population worldwide. Given this considerable prevalence, efforts are being made to elucidate the underlying disease mechanisms. One or several genetic factors cause 30-40% of DDs, and only 10% are caused by environmental factors. The remaining 50% of DD patients go undiagnosed, mostly due to a lack of diagnostic techniques; in most undiagnosed cases the cause is thought to be a genetic factor or a combination of genetic and environmental factors. Despite the surge of new technologies entering the market, their implementation in diagnostic laboratories is hampered by costs, a lack of information about the expected diagnostic yield, and the wide range of platforms to choose from. This study evaluates new microarray methods for diagnosing idiopathic DDs, providing information about their added diagnostic value. Study I analysed 150 patients by array comparative genomic hybridization (array CGH, 44K and 244K), yielding a diagnosis in 18% of cases. These results are supported by other studies, indicating a substantial added diagnostic value of array CGH compared with conventional cytogenetic analysis. Nevertheless, 80% of the patients in Study I remained undiagnosed. In an effort to diagnose more patients, Study IV increased the resolution from 8.9 Kb (the 244K CGH array) to 0.7 Kb by using a single-nucleotide polymorphism (SNP) array. However, no additional pathogenic changes were detected in the 35 patients assessed, and thus, for diagnostic purposes, an array platform with ca 9 Kb resolution appears adequate. The recent vast increase in reports of detected aberrations and associated phenotypes has enabled the characterization of several new syndromes, first on the basis of a common aberration and thereafter by delineation of common clinical characteristics. In Study II, a familial deletion at 9q22.2q22.32 with variable penetrance was described.
Despite several reports of aberrations in the adjacent area at 9q associated with Gorlin syndrome, the patients in this family had a unique phenotype and did not present with the syndrome. In Study III, a familial duplication of chromosome 6p22.2 was described. The duplication caused increased expression of an important enzyme of the γ-aminobutyric acid (GABA) degradation pathway, causing oxidative stress of the brain, and thus, very likely, the mild mental retardation of these patients. These two case studies attempted to pinpoint candidate genes and to resolve the pathogenic mechanism causing the clinical characteristics of the patients. Presenting rare genetic and clinical findings to the international science and medical community enables interpretation of similar findings in other patients. The added value of molecular karyotyping in patients with idiopathic DD is evident. As a first line of testing, arrays with a median resolution of at least 9 Kb should be considered and further characterization of detected aberrations undertaken when possible. Diagnostic whole-exome sequencing may be the best option for patients who remain undiagnosed after high-resolution array analysis.
Resumo:
Abstract: The globalisation of markets, the incorporation of new technologies, the institutionalisation and consolidation of university studies, and the growing competitiveness of the industry have imposed significant changes on commercial communication. The absence of censuses and research on the characteristics of workers in the sector made it necessary to produce a profile of the advertising workforce in Argentina. As the data indicate, it is a field that is less feminised than in other countries, young, and marked by high job satisfaction.
Resumo:
[ES] Danka Multimedia is a cooperative society that offers integrated communication services in the fields of marketing and training through the use of new technologies. Owing to the sector in which it operates, as well as its own entrepreneurial vocation, Danka has experienced a gradual process of growth since its inception. This growth has become increasingly rapid and, as a result, the company has been forced to confront a series of problems arising from it. These growth-related conflicts, common to all kinds of companies, are amplified in the case of Danka Multimedia by its legal form, the values and culture this entails, and the sector in which it carries out its activity. The management of this growth by Danka Multimedia is the central issue addressed throughout the case.