957 results for Bibliographic references
Abstract:
Description of the work: Shrinking Violets comprises two half-scale garments in laser-cut silk organza, developed with a knotting device to allow for disassembly and reassembly. The first is a jacket in layered red organza with black storm-flap details. The second is a vest in jade organza with circles of pink organza attached through a pattern of knots.

Research background: This practice-led fashion design research sits within the field of Design for Sustainability (DfS) in fashion, which seeks to mitigate the environmental and ethical impacts of fashion consumption and production. The research explores new systems of garment construction for DfS and examines how these systems may involve 'designing' new user interactions with the garments. The garments' construction system allows them to be disassembled and recycled, or reassembled by users to form a new garment. Conventional garment design follows a set process of cutting and construction, with pattern pieces permanently machine-stitched together. Garments typically contain multiple fibre types; for example, a jacket may be constructed from a shell of wool/polyester, an acetate lining, fusible interlinings, and plastic buttons. These complex inputs make textile recycling highly labour-intensive: first the garment pieces must be separated, and then the multiple fibre types sorted. This difficulty results in poor-quality 'shoddy' composed of many fibre types and unsuitable for new apparel, or in large quantities of recyclable textile waste sent to landfill (Hawley 2011). Design-led approaches that consider the garment's end of life in the design process are a way of addressing this problem. In Gulich's (2006) analysis, use of a single material is the most effective way to ensure ease of recycling, followed by multiple materials that can be detached. Given the low rate of technological innovation in most apparel manufacturing (Ruiz 2014), a challenge for effective recycling is to develop new manufacturing methods that allow garments to be more easily disassembled at end of life.

Research contribution: This project addresses the research question: how can design for disassembly be considered within the fashion design process? I have employed a practice-led methodology in which my design process leads the research, drawing on methods of fashion design practice including garment and construction research, fabric and colour research, textile experimentation, drape, patternmaking, and illustration, as well as more recent methods such as laser cutting. Interrogating traditional approaches to garment construction is necessarily a technical process; however, fashion design is as much about the aesthetics and desirability of a garment as it is about the garment's pragmatics or utility. This requires balancing the technical demands of designing for disassembly against the aesthetic demands of fashion, which led to the selection of luxurious, semi-transparent fabrics in bold floral colours that could be layered to create multiple visual effects, and to experimentation with laser cutting for new forms of finishing and fastening the fabrics together. Shrinking Violets makes two contributions to new knowledge in the area of design for sustainability within fashion. The first is the technical development of apparel modularity through the system of laser-cut holes and knots, which also becomes a patterning device. The second lies in the design of a system for users to engage with the garment through its ability to be easily reconstructed into a new form.

Research significance: Shrinking Violets was exhibited at the State Library of Queensland's Asia Pacific Design Library, 1-5 November 2015, as part of the International Association of Societies of Design Research's (IASDR) biennial design conference. The work was chosen for display by a panel of experts on the criteria of design innovation and contribution to new knowledge in design.

References:
Gulich, B. (2006). Designing textile products that are easy to recycle. In Y. Wang (Ed.), Recycling in Textiles (pp. 25-37). London: Woodhead.
Hawley, J. M. (2011). Textile recycling options: exploring what could be. In A. Gwilt & T. Rissanen (Eds.), Shaping Sustainable Fashion: Changing the way we make and use clothes (pp. 143-155). London: Earthscan.
Ruiz, B. (2014). Global Apparel Manufacturing. Retrieved 10 August 2014, from http://clients1.ibisworld.com/reports/gl/industry/default.aspx?entid=470
Abstract:
This study seeks to fill a gap in the existing literature by examining how, and whether, disclosure of social value creation becomes part of the legitimation strategies of social enterprises. Drawing on the moral dimension of Suchman's (1995) legitimacy theory, this study shows that three global social organizations, Grameen Bank, Charity Water, and the Bill and Melinda Gates Foundation, disclose social value creation as if they conform to the expectations of the broader community. The study finds an apparent disconnection between disclosure and action by social enterprises. With reference to a few incidents highlighted in this study, social enterprises use disclosures as managerial efforts rather than as a means of creating moral legitimacy. This apparent disconnection between disclosure and real action by the case social enterprises parallels the motivations behind corporate disclosure practices captured in the extant disclosure literature. The findings suggest that when an organisation (whether a corporation or a social enterprise) faces a legitimacy crisis, it tends to disclose good news rather than bad news, calling its moral legitimacy into question.
Abstract:
Boron neutron capture therapy (BNCT) is a radiotherapy that has mainly been used to treat malignant brain tumours, melanomas, and head and neck cancer. In BNCT, the patient receives an intravenous infusion of a 10B carrier, which accumulates in the tumour area. The tumour is irradiated with epithermal or thermal neutrons, which trigger a boron neutron capture reaction that generates heavy particles that damage tumour cells. In Finland, boronophenylalanine fructose (BPA-F) is used as the 10B carrier. Currently, the transfer of boron from blood to tumour, as well as the spatial and temporal accumulation of boron in the brain, are not precisely known. Proton magnetic resonance spectroscopy (1H MRS) could be used for selective BPA-F detection and quantification, since the aromatic protons of BPA resonate in a spectral region that is clear of brain metabolite signals. This study, which included both phantom and in vivo experiments, examined the validity of 1H MRS as a tool for BPA detection. In the phantom study, BPA quantification and the detection limit of BPA were studied at 1.5 and 3.0 T with single voxel 1H MRS, and at 1.5 T with magnetic resonance spectroscopic imaging (MRSI). In phantom conditions, BPA quantification accuracies of ±5% and ±15% were achieved with single voxel MRS using external and internal (internal water signal) concentration references, respectively. For MRSI, a quantification accuracy of <5% was obtained using an internal concentration reference (creatine). The detection limits of BPA in phantom conditions for the PRESS sequence were 0.7 mM (3.0 T) and 1.4 mM (1.5 T) with 20 × 20 × 20 mm3 single voxel MRS, and 1.0 mM with acquisition-weighted MRSI (nominal voxel volume 10(RL) × 10(AP) × 7.5(SI) mm3). In the in vivo study, MRSI, single voxel MRS, or both were performed for ten patients (patients 1-10) on the day of BNCT. Three patients had glioblastoma multiforme (GBM), five had a recurrent or progressing GBM or grade III anaplastic astrocytoma, and two had head and neck cancer. For nine patients (patients 1-9), MRS/MRSI was performed 70-140 min after the second irradiation field; for one patient (patient 10), the MRSI study began 11 min before the end of the BPA-F infusion and ended 6 min after the end of the infusion. In comparison, single voxel MRS was performed before BNCT for two patients (patients 3 and 9); for one patient (patient 9), MRSI was also performed one month after treatment, and for another (patient 10), MRSI was performed four days before infusion. Signals from the aromatic region of the tumour spectrum were detected on the day of BNCT in three patients, indicating that in favourable cases it is possible to detect BPA in vivo in the patient's brain after BNCT treatment or at the end of BPA-F infusion. However, because the shape and position of the detected signals did not exactly match the BPA spectrum detected in vitro, assignment of BPA is difficult. Performing MRS immediately after the end of BPA-F infusion in more patients would be necessary to evaluate the suitability of 1H MRS for BPA detection and quantification for treatment-planning purposes. Nevertheless, it may be possible to use MRSI as a criterion for selecting patients for BNCT.
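The internal-reference quantification the abstract mentions (creatine as concentration reference for MRSI) boils down to a peak-area ratio corrected for the number of protons contributing to each signal. The sketch below illustrates that arithmetic; the peak areas, proton counts, and assumed creatine concentration are illustrative placeholders, not values from the study.

```python
# Minimal sketch of internal-reference MRS quantification. All numeric
# values are assumptions for illustration only.

def quantify_bpa(area_bpa: float, area_cr: float,
                 conc_cr_mM: float = 8.0,   # assumed brain creatine concentration
                 protons_bpa: int = 4,      # assumed aromatic protons of BPA
                 protons_cr: int = 3) -> float:
    """Estimate BPA concentration (mM) from fitted peak areas.

    Signal area scales with (concentration x number of contributing protons),
    so the area ratio, corrected for proton counts, gives the concentration
    ratio relative to the internal reference.
    """
    return (area_bpa / area_cr) * (protons_cr / protons_bpa) * conc_cr_mM

# Example with hypothetical fitted areas from a tumour-voxel spectrum.
print(f"[BPA] ~ {quantify_bpa(area_bpa=0.9, area_cr=1.5):.1f} mM")
```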
Abstract:
This paper proposes a control method that can balance the input currents of a three-phase three-wire boost rectifier under unbalanced input voltage conditions. The control objective is to operate the rectifier in high-power-factor mode under balanced input voltage conditions, but to give overriding priority to the current-balancing function when the input voltage is unbalanced. The control structure is divided into two major functional blocks. The inner-loop current-mode controller implements resistor emulation to achieve high-power-factor operation on each of the two orthogonal axes of the stationary reference frame. The outer control loop performs magnitude-scaling and phase-shifting operations on the current of one axis to balance it with the current on the other axis. The coefficients of the scaling and shifting functions are determined by two closed-loop proportional-integral (PI) controllers that impose the conditions of input current balance as PI references. The control algorithm is simple and performs well. It requires neither input voltage sensing nor transformation of the control variables into a rotating reference frame. Simulation results on a MATLAB-SIMULINK platform validate the proposed control strategy. In the implementation, Texas Instruments' digital signal processor TMS320F240 is used as the digital controller. The control algorithm for high-power-factor operation is tested on a prototype boost rectifier under nominal and unbalanced input voltage conditions.
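A rough discrete-time sketch of the outer balancing loop described above: two PI controllers drive a magnitude-scaling coefficient and a phase-shift coefficient that are applied to one axis's current command so it matches the other axis. The gains, sample time, and error definitions are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of the outer current-balancing loop: one PI enforces equal
# current magnitudes across the two stationary-frame axes, the other
# corrects the relative phase. Gains and sample time are assumed values.

class PI:
    def __init__(self, kp: float, ki: float, ts: float):
        self.kp, self.ki, self.ts = kp, ki, ts
        self.integral = 0.0

    def update(self, error: float) -> float:
        self.integral += self.ki * error * self.ts
        return self.kp * error + self.integral

TS = 100e-6                            # assumed 10 kHz control loop
pi_mag = PI(kp=0.5, ki=50.0, ts=TS)    # magnitude-balance controller
pi_phase = PI(kp=0.5, ki=50.0, ts=TS)  # phase-balance controller

def outer_loop(i_alpha_rms: float, i_beta_rms: float, phase_err: float):
    """Return (k, d): scaling and phase-shift corrections for one axis."""
    k = 1.0 + pi_mag.update(i_alpha_rms - i_beta_rms)  # equalize magnitudes
    d = pi_phase.update(phase_err)                     # drive phase error to zero
    return k, d
```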
Abstract:
The aim of this licentiate thesis is to analyse how femininity is constructed in twelve portrait interviews of women in the dailies Dagens Nyheter (Stockholm) and Hufvudstadsbladet (Helsinki) in September 1996, and to explore the portrait interview as a media genre. The qualitative analysis has a feminist and constructionist perspective and is connected to critical text analysis. It was carried out on two levels: first, femininity is identified on the linguistic level through choice of words, and second on the level of content (topical motifs/themes). The portrait interview as a genre constitutes a third dimension of the analysis, aimed not at the identification of femininity but at characterising the portrait interview as a relatively unexplored media genre. References (Swedish: omtal) to the principal character (or protagonist) are traced mainly through reference chains consisting of names, pronouns, and substantive phrases. The interviewees were referred to by their full names in Dagens Nyheter (with the exception of the oldest and youngest interviewees, both of whom were mainly referred to by their first names), while the style of reference varied more in Hufvudstadsbladet. The position of the principal character was also analysed through her relation in the text to minor characters from her working life and her private life. These minor characters maintained their subordinate positions in all of the portraits except that of the youngest principal character, in which the subsidiary voices became at least as strong as the voice of the principal character. Three frequently recurring topical motifs occurred in the portraits: the first involved explanations for the principal character's success, divided into three categories (agent, affect, and ambition); the second concerned journeys or trips as symbols for turning points in life; and the third referred to the ambiguity in the contradiction between private (family/other private life) and public (work) life. This ambiguity is connected to the portrait interview as a text type (genre) that features conclusions at the end of portraits, which in turn is characteristic of reportage. However, the analysis showed that the conclusions of the portrait interviews often also included elements of ambiguity. This was evident in the contradictions between private and public life that arose in the portrait interviews focusing on these two spheres. The portraits that focused on the principal character's public life showed ambiguity on a more general level, concerning questions about being a woman and having a profession, and they often ended with a description of some details of her private life. The women in the portraits were all constructed as successful, in terms of having achieved direct success, reflective success, or success in the form of life wisdom. The women of direct success were described as ambitious individuals with no sidetracks on their life paths, while those of reflective success were described as active heroines who had received help from various agents, who could use their affects as enriching ingredients in life, but who in the end had control over their own lives (life stories). The elderly women were constructed as having achieved life wisdom, and their portraits focused on the past.
The portrait interview as a genre is characterised by journalistic freedom (in relation to the stricter news genre), by a 'now room' (Swedish: nurum) where the journalist meets the principal character (usually via spoken dialogue that she or he transforms into written text to be read by a mass-media audience), and by the relatively closed structure of the portrait. The portrait is relatively independent of the news genre and of the context of what has previously been written, what is being written at the time, and what will be written in the future; the principal character does not need to belong to the newspaper's usual gallery of actors. Furthermore, the principal character is constructed as independent in relation to the subsidiary characters and other media actors. The conflict lies within the principal character herself and within her life story, unlike the news genre, in which equal actors are in conflict with each other. The portrait is also independent of the news lifespan; the publishing timetable is not as strict as in the news genre, but it still depends on the factors that initiated the portrait. The appendices consist of a raw analysis of two of the twelve portrait interviews and of copies of all the portraits.
Abstract:
This study investigates the role of social media as a form of organizational knowledge sharing. Social media is examined in terms of the Web 2.0 technologies that organizations provide their employees as tools of internal communication. The study is anchored in a theoretical understanding of social media as technologies that enable both knowledge collection and knowledge donation, and it investigates the factors influencing employees' use of social media in their working environment. The study presents the multidisciplinary research tradition concerning knowledge sharing, and social media is analyzed especially in relation to internal communication and knowledge sharing. Based on previous studies, it is assumed that personal, organizational, and technological factors influence employees' use of social media in their working environment. The research is a case study focusing on the employees of the Finnish company Wärtsilä, an eligible case organization for this study given that it has put several Web 2.0 tools into use in its intranet. The research is based on quantitative methods. In total, 343 responses were obtained through an online survey available in Wärtsilä's intranet. The associations between the variables are analyzed using correlations, and the causality between the assumed factors and the use of social media is tested with multiple linear regression analysis. The analysis demonstrates that personal, organizational, and technological factors influence the respondents' use of social media. Strong predictors include the benefits that respondents expect to receive from using social media and respondents' experience of using Web 2.0 in their private lives. Organizational factors such as managers' and colleagues' activeness and organizational guidelines for using social media also form a causal relationship with the use of social media, as does respondents' understanding of their responsibilities: the more social media is considered part of individual responsibilities, the more frequently it is used. Finally, technological factors must be recognized: the more user-friendly the social media tools are considered and the better the respondents' technical skills, the more frequently social media is used in the working environment. The central references concerning knowledge sharing include Chun Wei Choo's (2006) The Knowing Organization, Ikujiro Nonaka and Hirotaka Takeuchi's (1995) The Knowledge-Creating Company, and Linda Argote's (1999) Organizational Learning.
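For readers unfamiliar with the analysis pipeline the abstract describes (correlations followed by multiple linear regression), a minimal sketch is given below. The column names and the data file are hypothetical placeholders, not the study's actual variables or data.

```python
# Sketch of correlation plus multiple-linear-regression analysis of
# survey data. "survey_responses.csv" and all column names are
# hypothetical stand-ins for the study's variables.

import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey_responses.csv")

predictors = ["expected_benefits", "private_web20_experience",
              "manager_activity", "org_guidelines",
              "perceived_responsibility", "user_friendliness",
              "technical_skills"]

# Pairwise associations between the assumed factors and usage frequency.
print(df[predictors + ["social_media_use"]].corr())

# Multiple linear regression: does each factor predict use when the
# others are held constant?
X = sm.add_constant(df[predictors])
model = sm.OLS(df["social_media_use"], X).fit()
print(model.summary())
```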
Abstract:
As the virtual world grows more complex, finding a standard way to store data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there will be only one copy of a user's family photo album, while the user can use multiple tools to show or manipulate it. Copies of a user's data could be stored on some of his family members' computers, on some of his own computers, and also at some online services that he uses. When all actors operate on one replicated copy of the data, the system automatically avoids a single point of failure: the data will not disappear because one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable for users and make it possible to have the same data stored in various locations. We studied three systems, Persona, Freenet, and GNUnet, which suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing anonymous web access, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends rather than letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we concluded that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
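The "cryptographically verifiable reference" idea the abstract describes can be illustrated with generic content addressing: name a data item by the hash of its content, so any replica fetched from a relative's machine or an online service can be checked against its reference. This is a sketch of the general technique, not Peerscape's actual format.

```python
# Generic content addressing: a data item's name is the hash of its
# content, so every copy is self-verifying wherever it is stored.

import hashlib

def make_reference(data: bytes) -> str:
    """Derive a self-verifying name for a data item from its content."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, reference: str) -> bool:
    """A copy is authentic iff it hashes back to its reference."""
    return make_reference(data) == reference

photo = b"...family photo album bytes..."
ref = make_reference(photo)
assert verify(photo, ref)                 # genuine replica passes
assert not verify(photo + b"x", ref)      # tampered replica is rejected
```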
Abstract:
The notion of optimization is inherent in protein design. A long linear chain of twenty types of amino acid residues is known to fold into a 3-D conformation that minimizes the combined inter-residue energy interactions. There are two distinct protein design problems, viz. predicting the folded structure from a given sequence of amino acid monomers (the folding problem) and determining a sequence for a given folded structure (the inverse folding problem). These two problems have much in common with engineering structural analysis and structural optimization problems, respectively. In the folding problem, a protein chain with a given sequence folds to a conformation, called the native state, which has a unique global minimum energy value compared to all other unfolded conformations. This involves a search in conformation space, somewhat akin to the principle of minimum potential energy that determines the deformed static equilibrium configuration of an elastic structure of given topology, shape, and size subjected to certain boundary conditions. In the inverse folding problem, one has to design a sequence with certain objectives (a specific feature of the folded structure, docking with another protein, etc.) and constraints (the sequence being fixed in some portion, a particular composition of amino acid types, etc.) while obtaining a sequence that folds to the desired conformation and satisfies the criteria of folding. This requires a search in sequence space and is similar to structural optimization in the design-variable space, wherein a certain feature of the structural response is optimized subject to some constraints while satisfying the governing static or dynamic equilibrium equations. Based on this similarity, in this work we apply topology optimization methods to protein design, discuss modeling issues, and present some initial results.
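To make the sequence-space search concrete, here is a toy version of the inverse folding problem: fix a fold (represented as a contact map) and search over sequences for low contact energy. The two-letter HP alphabet, the contact map, and the energy table are illustrative toys, not the paper's actual model.

```python
# Toy inverse-folding search: the fold is fixed (a contact map), and we
# search sequence space for low combined inter-residue contact energy.
# Alphabet, contacts, and energies are illustrative assumptions.

import random

CONTACTS = [(0, 5), (1, 4), (2, 7), (3, 6)]      # assumed fixed fold topology
ENERGY = {("H", "H"): -1.0, ("H", "P"): 0.0,
          ("P", "H"): 0.0, ("P", "P"): 0.0}      # hydrophobic contacts favoured

def fold_energy(seq) -> float:
    return sum(ENERGY[(seq[i], seq[j])] for i, j in CONTACTS)

def design_sequence(length: int = 8, steps: int = 1000) -> str:
    """Greedy random search in sequence space for the fixed fold."""
    seq = [random.choice("HP") for _ in range(length)]
    best = fold_energy(seq)
    for _ in range(steps):
        trial = seq[:]
        i = random.randrange(length)
        trial[i] = "H" if trial[i] == "P" else "P"  # point mutation
        e = fold_energy(trial)
        if e <= best:                               # keep non-worsening moves
            seq, best = trial, e
    return "".join(seq)

print(design_sequence())  # e.g. a sequence placing H at every contact pair
```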
Abstract:
32P-labelled 5S RNA isolated from Mycobacterium smegmatis was digested with T1 and pancreatic ribonucleases separately and fingerprinted by two-dimensional high-voltage electrophoresis on thin-layer DEAE-cellulose plates. The radioactive spots were sequenced and their molar yields determined. The chain length of the 5S RNA was found to be 120 nucleotides. It showed resemblances to both prokaryotic and eukaryotic 5S RNAs.
Abstract:
The Lucianic text of the Septuagint of the Historical Books, witnessed primarily by the manuscript group L (19, 82, 93, 108, and 127), consists of at least two strata: the recensional elements, which date back to about 300 C.E., and the substratum under these recensional elements, the proto-Lucianic text. Some distinctive readings in L seem to be supported by witnesses that antedate the supposed time of the recension. These witnesses include the biblical quotations of Josephus, Hippolytus, Irenaeus, Tertullian, and Cyprian, and the Old Latin translation of the Septuagint. It has also been posited that some Lucianic readings might go back to Hebrew readings that are not found in the Masoretic text but appear in the Qumran biblical texts. This phenomenon constitutes the proto-Lucianic problem. In chapter 1 the proto-Lucianic problem and its research history are introduced. Josephus' references to 1 Samuel are analyzed in chapter 2. His agreements with L are few and are mostly only apparent or, at best, coincidental. In chapters 3-6 the quotations by four early Church Fathers are analyzed. Hippolytus' Septuagint text is extremely hard to establish, since his quotations from 1 Samuel have only been preserved in Armenian and Georgian translations. Most of the suggested agreements between Hippolytus and L are only apparent or coincidental. Irenaeus is the most trustworthy textual witness of the four early Church Fathers. His quotations from 1 Samuel agree with L several times against codex Vaticanus (B) and all or most of the other witnesses in preserving the original text. Tertullian and Cyprian agree with L in attesting some Hebraizing approximations that do not seem to be of Hexaplaric origin; these are more likely early Hebraizing readings of the same tradition as the kaige recension. In chapter 7 it is noted that Origen, although a pre-Lucianic Father, does not qualify as a proto-Lucianic witness. General observations about the Old Latin witnesses, as well as an analysis of the manuscript La115, are given in chapter 8. In chapter 9 the theory of the proto-Lucianic recension is discussed. In order to demonstrate the existence of the proto-Lucianic recension, one should find instances of indisputable agreement between the Qumran biblical manuscripts and L in readings that are secondary in Greek. No such case can be found in the Qumran material in 1 Samuel. In the text-historical conclusions (chapter 10) it is noted that of all the suggested proto-Lucianic agreements in 1 Samuel (about 75, plus 70 in La115), more than half are only apparent or, at best, coincidental. Of the indisputable agreements, however, 26 are agreements in the original reading. In about 20 instances the agreement is in a secondary reading. These agreements are early variants, mostly minor changes of the kind that happen all the time in the course of transmission. Four of the agreements, however, are in a pre-Hexaplaric Hebraizing approximation that found its way independently into the pre-Lucianic witnesses and the Lucianic recension. The study aims at demonstrating the value of the Lucianic text as a textual witness: under the recensional layer(s) there is an ancient text that preserves very old, even original, readings that have not been preserved in B and most of the other witnesses. The study also confirms the value of the early Church Fathers as textual witnesses.
Abstract:
This master's thesis studies how trade liberalization affects firm-level productivity and industrial evolution. To do so, I build a dynamic model that treats firm-level productivity as endogenous in order to investigate the influence of trade on firms' productivity and on market structure. In this framework, heterogeneous firms in the same industry operate differently in equilibrium: firms are ex ante identical, but heterogeneity arises as an equilibrium outcome. Under monopolistic competition, this type of model yields an industry that is represented not by a steady-state outcome but by an evolution that depends on the decisions made by individual firms. I show that trade liberalization has a generally positive impact on technology adoption rates and hence increases firm-level productivity. This endogenous technology adoption model also captures the stylized fact that exporting firms are larger and more productive than their non-exporting counterparts in the same sector. I assume that the number of firms is endogenous since, according to the empirical literature, industrial evolution shows considerably different patterns across countries: some industries experience large-scale exit of firms in periods of contracting market share, while others display a relatively stable or gradually increasing number of firms. The term 'shakeout' describes a dramatic decrease in the number of firms. To explain the causes of shakeouts, I construct a model in which forward-looking firms decide to enter and exit the market on the basis of their state of technology. In equilibrium, firms choose different dates to adopt the innovation, which generates a gradual diffusion process, and it is exactly this gradual diffusion that generates the rapid, large-scale exit phenomenon. Specifically, the model demonstrates a positive feedback between exit and adoption: the reduction in the number of firms increases the incentives for the remaining firms to adopt the innovation. Thus, in a setting of complete information, the model not only generates a shakeout but also captures the stability of an industry. However, a solely national view of industrial evolution neglects the importance of international trade in determining the shape of market structure. In particular, I show that higher trade barriers lead to more fragile markets, encouraging over-entry in the initial stage of the industry life cycle and raising the probability of a shakeout. Therefore, more liberalized trade generates a more stable market structure from both national and international viewpoints. The main references are Ederington and McCalman (2008, 2009).
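The adoption-exit feedback at the core of the shakeout argument can be illustrated with a toy simulation: as adoption spreads, non-adopters' profits fall and they exit, and each exit raises the survivors' incentive to adopt. All parameter values below are illustrative assumptions, not calibrated to the thesis model.

```python
# Toy simulation of the shakeout feedback: diffusion of an innovation
# squeezes non-adopters, exit concentrates the market, and concentration
# accelerates adoption. Parameters are illustrative assumptions.

import random

n_firms, adopters = 100, 0
history = []
for t in range(30):
    # Non-adopter profit falls as adoption spreads and with crowding.
    profit_non = 10.0 - 0.15 * adopters - 0.05 * n_firms
    # Unprofitable non-adopters exit (the shakeout).
    if profit_non < 0:
        non_adopters = n_firms - adopters
        n_firms -= min(non_adopters, max(1, non_adopters // 2))
    # Fewer rivals -> stronger incentive to adopt (positive feedback).
    adoption_prob = 0.05 + 0.4 * (1 - n_firms / 100)
    for _ in range(n_firms - adopters):
        if random.random() < adoption_prob:
            adopters += 1
    history.append(n_firms)

print(history)  # firm count: stable early on, then a rapid large-scale drop
```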
Abstract:
An electron-beam melting and centrifugal splat-quenching technique for the production of microflakes of Ti-6Al-4V (wt%) alloy quenched at an average cooling rate of about 10⁵ K sec⁻¹ is described. The effect of substrate angle on the shape, size, microstructure, and average cooling rate of the flakes of the major sieve fractions is discussed. Morphologies of particles of the minor sieve fractions are dealt with briefly.