930 results for Design-manufacturing integration


Relevance: 30.00%

Abstract:

Today, databases have become an integral part of information systems. In the past two decades, we have seen different database systems being developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining & knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches. However, no single, generally accepted methodology has emerged in academia or industry that provides ubiquitous intelligent data access from heterogeneous, autonomous, distributed information sources. This thesis describes a heterogeneous database system being developed at the High-performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty of resolving semantic heterogeneity, that is, identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of the thesis work include: (i.) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system, utilizing the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii.) a methodology for semantic heterogeneity resolution that investigates the extents of the meta-data constructs of component schemas; this methodology is shown to be correct, complete and unambiguous; (iii.) a semi-automated technique for identifying semantic relations, which is the basis of semantic knowledge for integration and querying, using shared ontologies for context mediation; (iv.) resolutions for schematic conflicts and a language for defining global views from a set of component Sem-ODM schemas; (v.) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process; this knowledge base acts as the interface between the integration and query processing modules; (vi.) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii.) a framework for intelligent computing and communication on the Internet applying the concepts of our work.
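To make the context-mediation idea concrete, here is a minimal sketch (with an invented vocabulary and matching rule, not the thesis's actual Sem-ODM machinery) of how metadata constructs from autonomous component schemas might be linked through a shared ontology to surface candidate semantic relations:

```python
from collections import defaultdict

# Shared ontology: each concept lists surface terms that may name it in
# component schemas (toy vocabulary, purely illustrative).
ontology = {
    "employee": {"worker", "staff", "employee"},
    "salary": {"pay", "wage", "salary"},
}

def concept_of(term):
    """Map a schema construct name onto a shared-ontology concept, if any."""
    return next((c for c, terms in ontology.items() if term.lower() in terms), None)

# Metadata constructs harvested from two autonomous component schemas.
constructs = ["db1.Staff.wage", "db2.Worker.pay", "db2.Worker.dept"]

candidates = defaultdict(list)
for construct in constructs:
    concept = concept_of(construct.rsplit(".", 1)[-1])
    if concept:
        candidates[concept].append(construct)

# candidates["salary"] -> ["db1.Staff.wage", "db2.Worker.pay"]: a candidate
# semantic relation for a human integrator to confirm (the semi-automated step).
print(dict(candidates))
```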

Relevance: 30.00%

Abstract:

Internship report for obtaining the degree of Master in the field of Education and Multimedia Communication

Relevance: 30.00%

Abstract:

This report is a review of additive and subtractive manufacturing techniques. Additive manufacturing has resided largely in the prototyping realm, where it provides methods of producing complex freeform solid objects directly from a computer model without part-specific tooling or knowledge. But these technologies are evolving steadily and are beginning to encompass related systems of material addition, subtraction, assembly, and insertion of components made by other processes. Furthermore, these various additive processes are starting to evolve into rapid manufacturing techniques for mass-customized products, away from narrowly defined rapid prototyping. Taken far enough down the line, and several years hence, this idea could drive a radical restructuring of manufacturing: manufacturing itself would move from a resource base to a knowledge base, and from mass production of single-use products to mass-customized, high-value, life-cycle products. To date, the majority of research and development has focused on advanced development of existing technologies by improving processing performance, materials, modelling and simulation tools, and design tools to enable the transition from prototyping to manufacturing of end-use parts.

Relevance: 30.00%

Abstract:

Measurement and verification of products and processes during early design is attracting increasing interest from high-value manufacturing industries. Measurement planning is deemed an effective means to facilitate the integration of the metrology activity into a wider range of production processes. However, the literature reveals very few research efforts in this field, especially regarding large-volume metrology. This paper presents a novel approach to instrument selection, the first stage of the measurement planning process, by mapping measurability characteristics between specific measurement assignments and instruments.
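As an illustration of the mapping idea, the toy sketch below filters a pool of instruments against a measurement assignment's requirements; the characteristic names and values are invented for illustration and are not the paper's actual measurability schema:

```python
# Each instrument advertises measurability characteristics; each measurement
# assignment states its requirements; first-stage selection keeps instruments
# whose envelope covers the assignment. Fields and values are hypothetical.
instruments = [
    {"name": "laser tracker", "max_range_m": 40.0, "uncertainty_um": 15.0},
    {"name": "photogrammetry", "max_range_m": 10.0, "uncertainty_um": 50.0},
]

def candidate_instruments(assignment, instruments):
    """Keep instruments whose range and uncertainty satisfy the task."""
    return [i["name"] for i in instruments
            if i["max_range_m"] >= assignment["range_m"]
            and i["uncertainty_um"] <= assignment["tolerance_um"]]

task = {"range_m": 25.0, "tolerance_um": 30.0}
print(candidate_instruments(task, instruments))  # ['laser tracker']
```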

Relevance: 30.00%

Abstract:

Aircraft manufacturing industries are looking for solutions to increase their productivity. One of the solutions is to apply metrology systems during the production and assembly processes. The Metrology Process Model (MPM) (Maropoulos et al., 2007) has been introduced, which ties metrology applications to assembly planning, manufacturing processes and product design. Measurability analysis is part of the MPM, and the aim of this analysis is to check the feasibility of measuring the designed large-scale components. Measurability analysis has been integrated in order to provide an efficient matching system. The metrology database is structured by developing the Metrology Classification Model. Furthermore, the feature-based selection model is also explained. By combining the two classification models, a novel approach and selection process for an integrated measurability analysis system (MAS) are introduced; such an integrated MAS can provide much more meaningful matching results for operators. © Springer-Verlag Berlin Heidelberg 2010.

Relevance: 30.00%

Abstract:

Integrating information from multiple sources is a crucial function of the brain. Examples of such integration include multiple stimuli of different modalities, such as visual and auditory; multiple stimuli of the same modality, such as two auditory stimuli; and the integration of stimuli from the sensory organs (i.e. the ears) with stimuli delivered by brain-machine interfaces.

The overall aim of this body of work is to empirically examine stimulus integration in these three domains to inform our broader understanding of how and when the brain combines information from multiple sources.

First, I examine visually guided auditory learning, a problem with implications for the general question of how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound, an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a ‘guess and check’ heuristic in which visual feedback obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain’s reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound, but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6 degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.
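As a rough illustration of the ‘guess and check’ account, the toy delta-rule model below (learning rate and trial count invented, not fitted to the data) shows how small feedback-driven updates can produce the kind of partial shift reported:

```python
# Toy 'guess and check' update: after each auditory-guided saccade, post-saccade
# visual feedback nudges the stored auditory estimate by a fraction of the error.
# Parameters are hypothetical; only the qualitative behavior is the point.
def update(estimate_deg, visual_deg, lr=0.01):
    return estimate_deg + lr * (visual_deg - estimate_deg)

estimate = 0.0                    # initial auditory-only localization bias (deg)
for _ in range(30):               # repeated exposure to a 6 deg A-V mismatch
    estimate = update(estimate, 6.0)
print(round(estimate, 2))         # ~1.56 deg: a partial shift, within the 1.3-1.7 deg range reported
```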

My next line of research examines how electrical stimulation of the inferior colliculus influences the perception of sounds in a nonhuman primate. The central nucleus of the inferior colliculus is the major ascending relay of auditory information: almost all auditory signals pass through it before reaching the forebrain. This makes the inferior colliculus an ideal structure for understanding the format of the inputs into the forebrain and, by extension, the processing of auditory scenes that occurs in the brainstem, and it was therefore an attractive target for understanding stimulus integration in the ascending auditory pathway.

Moreover, understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation. Monkeys performed a frequency-based discrimination task, reporting whether a probe sound was higher or lower in frequency than a reference sound. Stimulation pulses were paired with the probe sound on 50% of trials (0.5-80 µA, 100-300 Hz, n=172 IC locations in 3 rhesus monkeys). Electrical stimulation tended to bias the animals’ judgments in a fashion that was coarsely but significantly correlated with the best frequency of the stimulation site in comparison to the reference frequency employed in the task. Although there was considerable variability in the effects of stimulation (including impairments in performance and shifts in performance away from the direction predicted based on the site’s response properties), the results indicate that stimulation of the IC can evoke percepts correlated with the frequency tuning properties of the IC. Consistent with the implications of recent human studies, the main avenue for improvement for the auditory midbrain implant suggested by our findings is to increase the number and spatial extent of electrodes, to increase the size of the region that can be electrically activated and provide a greater range of evoked percepts.

My next line of research employs a frequency-tagging approach to examine the extent to which multiple sound sources are combined (or segregated) in the nonhuman primate inferior colliculus. In the single-sound case, most inferior colliculus neurons respond and entrain to sounds in a very broad region of space, and many are entirely spatially insensitive, so it is unknown how the neurons will respond to a situation with more than one sound. I use multiple AM stimuli of different frequencies, which the inferior colliculus represents using a spike timing code. This allows me to measure spike timing in the inferior colliculus to determine which sound source is responsible for neural activity in an auditory scene containing multiple sounds. Using this approach, I find that the same neurons that are tuned to broad regions of space in the single sound condition become dramatically more selective in the dual sound condition, preferentially entraining spikes to stimuli from a smaller region of space. I will examine the possibility that there may be a conceptual linkage between this finding and the finding of receptive field shifts in the visual system.
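A minimal sketch of the frequency-tagging analysis, assuming spike times and AM tag frequencies are known (all values below are placeholders): entrainment to each tag can be quantified with the standard vector-strength measure, and the tag with stronger phase-locking is credited with driving the neuron:

```python
import numpy as np

def vector_strength(spike_times_s, mod_freq_hz):
    """Phase-locking (Goldberg-Brown vector strength) of spikes to one AM tag."""
    phases = 2 * np.pi * mod_freq_hz * spike_times_s
    return np.abs(np.mean(np.exp(1j * phases)))

# Hypothetical dual-sound trial: two sources carry different AM tag frequencies.
rng = np.random.default_rng(1)
spikes = np.sort(rng.uniform(0.0, 1.0, 120))        # placeholder spike times (s)
tags_hz = {"source_A": 38.0, "source_B": 52.0}      # assumed tag frequencies (Hz)
entrainment = {src: vector_strength(spikes, f) for src, f in tags_hz.items()}
print(max(entrainment, key=entrainment.get), entrainment)
```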

In chapter 5, I will comment on these findings more generally, compare them to existing theoretical models, and discuss what these results tell us about processing in the central nervous system in a multi-stimulus situation. My results suggest that the brain is flexible in its processing and can adapt its integration schema to fit the available cues and the demands of the task.

Relevance: 30.00%

Abstract:

This paper derives a theoretical framework for consideration of both the technologically driven dimensions of mobile payment solutions, and the associated value proposition for customers. Banks promote traditional payment instruments whose value proposition is the management of risk for both consumers and merchants. These instruments are centralised, costly and lack decision support functionality. The ubiquity of the mobile phone has provided a decentralised platform for managing payment processes in a new way, but the value proposition for customers has yet to be elaborated clearly. This inertia has stalled the design of sustainable revenue models for a mobile payments ecosystem. Merchants and consumers in the meantime are being seduced by the convenience of on-line and mobile payment solutions. Adopting the purchase and payment process as the unit of analysis, the current mobile payment landscape is reviewed with respect to the creation and consumption of customer value. From this analysis, a framework is derived juxtaposing customer value, related to what is being paid for, with payment integration, related to how payments are being made. The framework provides a theoretical and practical basis for considering the contribution of mobile technologies to the payment industry. The framework is then used to describe the components of a mobile payments pilot project being run on a trial population of 250 students on a campus in Ireland. In this manner, weaknesses in the value proposition for consumers and merchants were highlighted. Limitations of the framework as a research tool are also discussed.

Relevance: 30.00%

Abstract:

Carbon nanotubes (CNTs) have recently emerged as promising candidates for electron field emission (FE) cathodes in integrated FE devices. These nanostructured carbon materials possess exceptional properties and their synthesis can be thoroughly controlled. Their integration into advanced electronic devices, including not only FE cathodes, but sensors, energy storage devices, and circuit components, has seen rapid growth in recent years. The results of the studies presented here demonstrate that the CNT field emitter is an excellent candidate for next generation vacuum microelectronics and related electron emission devices in several advanced applications.

The work presented in this study addresses the factors that currently limit the performance and application of CNT-FE devices. Characterization studies and improvements to the FE properties of CNTs, along with Micro-Electro-Mechanical Systems (MEMS) design and fabrication, were utilized in achieving these goals. Important performance-limiting parameters, including emitter lifetime and failure from poor substrate adhesion, are examined. The compatibility and integration of CNT emitters with the governing MEMS substrate (i.e., polycrystalline silicon), and the impact of this substrate on these performance-limiting parameters, are reported. CNT growth mechanisms and kinetics were investigated and compared to growth on silicon (100) to improve the design of CNT-emitter-integrated MEMS-based electronic devices, specifically in vacuum microelectronic device (VMD) applications.

Improved growth allowed for the design and development of novel cold-cathode FE devices utilizing CNT field emitters. A chemical ionization (CI) source based on a CNT-FE electron source was developed and evaluated in a commercial desktop mass spectrometer for explosives trace detection. This work demonstrated the first reported use of a CNT-based ion source capable of collecting CI mass spectra. The CNT-FE source demonstrated low power requirements, pulsing capabilities, and average lifetimes of over 320 hours when operated in constant-emission mode under elevated pressures, without sacrificing performance. Additionally, a novel packaged ion source for miniature mass spectrometer applications, using CNT emitters, a MEMS-based Nier-type geometry, and a Low Temperature Cofired Ceramic (LTCC) 3D scaffold with integrated ion optics, was developed and characterized. While previous research has shown other devices capable of collecting ion currents on chip, this LTCC-packaged MEMS micro-ion source demonstrated improvements in energy and angular dispersion, as well as the ability to direct the ions out of the packaged source and towards a mass analyzer. Simulations and experimental design, fabrication, and characterization were used to make these improvements.

Finally, novel CNT-FE devices were developed to investigate their potential to perform as active circuit elements in VMD circuits. Difficulty integrating devices at micron scales has hindered the use of vacuum electronic devices in integrated circuits, despite the unique advantages they offer in select applications. Using a combination of particle trajectory simulation and experimental characterization, device performance in an integrated platform was investigated. Solutions to the difficulties of operating multiple devices in close proximity and of enhancing electron transmission (i.e., reducing grid loss) are explored in detail. A systematic and iterative process was used to develop isolation structures that reduced crosstalk between neighboring devices from 15% on average to nearly zero. Innovative geometries and a new operational mode reduced grid loss by nearly threefold, thereby improving transmission of the emitted cathode current to the anode from 25% in initial designs to 70% on average. These performance enhancements are important enablers for larger-scale integration and for the realization of complex vacuum microelectronic circuits.

Relevance: 30.00%

Abstract:

Atomic ions trapped in micro-fabricated surface traps can be utilized as a physical platform with which to build a quantum computer. They possess many of the desirable qualities of such a device, including high-fidelity state preparation and readout, universal logic gates, and long coherence times, and they can be readily entangled with each other through photonic interconnects. The use of optical cavities integrated with trapped-ion qubits as a photonic interface presents the possibility of order-of-magnitude improvements in performance in several key areas of their use in quantum computation. The first part of this thesis describes the design and fabrication of a novel surface trap for integration with an optical cavity. The trap is custom made on a highly reflective mirror surface and includes the capability of moving the ion trap location along all three trap axes with nanometer-scale precision. The second part of this thesis demonstrates the suitability of small micro-cavities formed from laser-ablated fused silica substrates, with radii of curvature in the 300-500 micron range, for use with the mirror trap as part of an integrated ion trap cavity system. Quantum computing applications for such a system include dramatic improvements in the photonic entanglement rate, up to 10 kHz; the qubit measurement time, down to 1 microsecond; and the measurement error rate, down to the 10^-5 range. The final part of this thesis details a performance simulator for exploring the physical resource requirements and performance demands of scaling such a quantum computer to sizes capable of performing quantum algorithms beyond the limits of classical computation.
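For a sense of the performance-simulator style of reasoning, here is a back-of-envelope sketch with invented parameters, chosen only so that their product reproduces the ~10 kHz entanglement rate quoted above; it is not the thesis's actual model:

```python
# Back-of-envelope model: remote entanglement between modules is attempted
# repeatedly and succeeds with probability p per attempt, so the expected
# entanglement rate is p * attempt_rate. Both parameters are hypothetical.
attempt_rate_hz = 1.0e6      # assumed photon-collection attempt rate
p_herald = 1.0e-2            # assumed heralded success probability per attempt
ent_rate_hz = attempt_rate_hz * p_herald     # -> 10 kHz
mean_wait_s = 1.0 / ent_rate_hz              # ~100 microseconds per Bell pair
print(f"{ent_rate_hz:.0f} Hz, {mean_wait_s * 1e6:.0f} us per pair")
```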

Relevance: 30.00%

Abstract:

Introduction: Computer-Aided Design (CAD) and Computer-Aided Manufacture (CAM) have been developed to fabricate fixed dental restorations accurately, faster, and with improved cost-effectiveness of manufacture when compared to the conventional method. Two main methods exist in dental CAD/CAM technology: the subtractive and additive methods. While the fitting accuracy of both methods has been explored, no study has yet compared the fabricated restoration (CAM output) to its CAD in terms of accuracy. The aim of the present study was to compare the output of various dental CAM routes to a sole initial CAD and establish the accuracy of fabrication. The internal fit of the various CAM routes was also investigated. The null hypotheses tested were: 1) no significant differences would be observed between the CAM output and the CAD, and 2) no significant differences would be observed between the various CAM routes. Methods: An aluminium master model of a standard premolar preparation was scanned with a contact dental scanner (Incise, Renishaw, UK). A single CAD was created on the scanned master model (InciseCAD software, V2.5.0.140, UK). Twenty copings were then fabricated by sending the single CAD to a multitude of CAM routes. The copings were grouped (n=5) as: laser-sintered CoCrMo (LS), 5-axis milled CoCrMo (M-CoCrMo), 3-axis milled zirconia (ZAx3) and 4-axis milled zirconia (ZAx4). All copings were micro-CT scanned (Phoenix X-Ray, Nanotom-S, Germany; power: 155 kV, current: 60 µA, 3600 projections) to produce 3-dimensional (3D) models. A novel methodology was created to superimpose the micro-CT scans on the CAD (GOM Inspect software, V7.5SR2, Germany) to indicate inaccuracies in manufacturing. Accuracy in terms of coping volume was explored. The distances from the surfaces of the micro-CT 3D models to the surfaces of the CAD model (CAD Deviation) were investigated after creating surface colour deviation maps. Localised digital sections of the deviations (occlusal, axial and cervical) and selected focussed areas were then quantitatively measured using software (GOM Inspect software, Germany). A novel methodology was also explored to digitally align (Rhino software, V5, USA) the micro-CT scans with the master model to investigate internal fit. Fifty digital cross-sections of the aligned scans were created, and point-to-point distances were measured at five levels at each cross-section: Vertical Marginal Fit (VF), Absolute Marginal Fit (AM), Axio-margin Fit (AMF), Axial Fit (AF) and Occlusal Fit (OF). Results: The volume measurements were summarised as VM-CoCrMo (62.8 mm3) > VZAx3 (59.4 mm3) > VCAD (57 mm3) > VZAx4 (56.1 mm3) > VLS (52.5 mm3), all significantly different (p < […]); CAD Deviation was presented as areas with different colours. No significant differences were observed at the internal cervical aspect between all groups of copings. Significant differences (p < […]) were observed as follows: 1) […] > M-CoCrMo at the internal occlusal, internal axial and external axial aspects; 2) ZAx3 > ZAx4 at the external occlusal and external cervical aspects; 3) ZAx3 < ZAx4 at the internal occlusal aspect; and 4) M-CoCrMo > ZAx4 at the internal occlusal and internal axial aspects. The mean values of AMF and AF were significantly (p < […]) different, with […] > M-CoCrMo and CAD > ZAx4. Only the VF of M-CoCrMo was comparable with the CAD internal fit. All VF and AM values were within the clinically acceptable fit (120 µm). Conclusion: The investigated CAM methods reproduced the CAD accurately at the internal cervical aspect of the copings. However, localised deviations at the axial and occlusal aspects of the copings may suggest the need for modifications in these areas prior to fitting and veneering with porcelain. The CAM groups evaluated also showed different levels of internal fit, thus rejecting the null hypotheses. The novel non-destructive methodologies for CAD/CAM accuracy and internal fit testing presented in this thesis may be a useful evaluation tool for similar applications.
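The CAD Deviation step can be pictured with a minimal sketch like the following, assuming registered point clouds for the scanned coping and the CAD reference; this illustrates nearest-surface distance measurement in general, not the GOM Inspect workflow itself:

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_deviation(scan_pts, cad_pts):
    """Unsigned distance from each scanned surface point to the nearest CAD point."""
    distances, _ = cKDTree(cad_pts).query(scan_pts)
    return distances

# Placeholder point clouds (mm); real inputs would come from the micro-CT
# reconstruction and the CAD surface after superimposition/registration.
rng = np.random.default_rng(0)
scan = rng.random((1000, 3))
cad = rng.random((1500, 3))
dev = surface_deviation(scan, cad)
print(f"mean {dev.mean():.3f} mm, 95th percentile {np.percentile(dev, 95):.3f} mm")
```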

Relevance: 30.00%

Abstract:

The integration of mathematics and science in secondary schools in the 21st century continues to be an important topic of practice and research. The purpose of my research study, which builds on studies by Frykholm and Glasson (2005) and Berlin and White (2010), is to explore the potential constraints and benefits of integrating mathematics and science in Ontario secondary schools based on the perspectives of in-service and pre-service teachers with various math and/or science backgrounds. A qualitative and quantitative research design with an exploratory approach was used. The qualitative data were collected from a sample of 12 in-service teachers with various math and/or science backgrounds recruited from two school boards in Eastern Ontario. The quantitative and some qualitative data were collected from a sample of 81 pre-service teachers from the Queen’s University Bachelor of Education (B.Ed.) program. Semi-structured interviews were conducted with the in-service teachers, while a survey and a focus group were conducted with the pre-service teachers. Once the data were collected, the qualitative data were abductively analyzed. For the quantitative data, descriptive and inferential statistics (one-way ANOVAs and Pearson chi-square analyses) were calculated to examine the perspectives of teachers regardless of teaching background and to compare groups of teachers based on teaching background. The findings of this study suggest that in-service and pre-service teachers have a positive attitude towards the integration of math and science and view it as valuable to student learning and success. The pre-service teachers viewed the integration as easy and did not express concerns about it. On the other hand, the in-service teachers highlighted concerns and challenges such as resources, scheduling, and time constraints. My results illustrate when teachers perceive it to be valuable to integrate math and science and which aspects of the classroom benefit most from the integration. Furthermore, the results highlight barriers and possible solutions to improve the integration of math and science. In addition to the benefits and constraints of integration, my results illustrate why some teachers may opt out of integrating math and science, and the different strategies teachers have incorporated to integrate math and science in their classrooms.

Relevance: 30.00%

Abstract:

The Development Permit System has been introduced with minimal directives for establishing a decision-making process. This is in opposition to the long-established process for minor variances and suggests that the Development Permit System does not necessarily incorporate all of Ontario’s fundamental planning principles. From this concept, the study aimed to identify how minor variances are incorporated into the Development Permit System. In order to examine this topic, the research was based around the following research questions: • How are ‘minor variance’ applications processed within the DPS? • To what extent do the four tests of a minor variance influence the outcomes of lower-level applications in the DPS approval process? A case study approach was used for this research. The single-case design employed both qualitative and quantitative research methods, including a review of academic literature, court cases, and official documents, as well as a content analysis of Class 1, 1A, and 2 Development Permit application files from the Town of Carleton Place that were decided between 2011 and 2015. Upon completion of the content analysis, it was found that minor variance issues were most commonly assigned to Class 1 applications. Planning staff generally met approval timelines and embraced their delegated approval authority, readily attaching conditions to applications in order to mitigate off-site impacts. While staff met the regulatory requirements of the DPS, ‘minor variance’ applications were largely decided on impact alone, demonstrating that the principles established by the four tests, the defining quality of the minor variance approval process, had not transferred to the Development Permit System. At the same time, there was some evidence that the development community has not fully adjusted to the requirements of the new approvals process, as some applications were supported using a rationale containing the four tests. Subsequently, a set of four recommendations is offered, reflecting the main themes established by the findings. The first two recommendations are directed towards the Province, the third to municipalities and the fourth to developers and planning consultants: 1) Amend Ontario Regulation 608/06 so that provisions under Section 4(3)(e) fall under Section 4(2). 2) Change the rhetoric from “combining elements of minor variances” to “replacing minor variances”. 3) Establish clear evaluation criteria. 4) Understand the evaluative criteria of the municipality in which you are working.

Relevance: 30.00%

Abstract:

Navigation devices used to be bulky and expensive and were not widely commercialized for personal use. Nowadays, all useful electronic devices are turning handheld so that they can be conveniently used anytime and anywhere. One can claim that almost any mobile phone used today has quite strong navigational capabilities that work efficiently anywhere on the globe. No matter where you are, you can easily know your exact location and make your way smoothly to wherever you would like to go. This would not have been possible without the existence of efficient and small microwave circuits responsible for the transmission and reception of high-quality navigation signals. This thesis is mainly concerned with the design of novel, highly miniaturized and efficient filtering components working in the Global Navigation Satellite Systems (GNSS) frequency band, to be integrated within an efficient Radio Frequency (RF) front-end module (FEM). A System-on-Package (SoP) integration technique is adopted for the design of all the components in this thesis. Two novel miniaturized filters are designed. One of them is a wideband filter targeting the complete GNSS band, with a fractional bandwidth of almost 50% at a center frequency of 1.385 GHz; it utilizes a direct inductive coupling topology to achieve the required wideband performance, and it also has very good out-of-band rejection and low insertion loss (IL). The other, a dual-band filter, covers only the lower and upper GNSS bands, with a rejection notch between the two bands, and has very good inter-band rejection. The well-known “divide and conquer” design methodology was applied in the design of this filter to help save valuable design and optimization time. Moreover, the performance of two commercially available ultra-low-noise amplifiers (LNAs) is studied. The complete RF FEM showed promising preliminary performance in terms of noise figure, gain and bandwidth, outperforming other commercial front-ends in these three aspects. All the designed circuits are fabricated and tested, and the measured results are found to be in good agreement with the simulations.
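As a quick check of what a 50% fractional bandwidth at 1.385 GHz implies (assuming the conventional definition FBW = (f_high - f_low) / f0 about an arithmetic-mean center frequency):

```python
# Band edges implied by a 50% fractional bandwidth at f0 = 1.385 GHz.
f0_ghz = 1.385
fbw = 0.50
bw = fbw * f0_ghz                           # absolute bandwidth: ~0.69 GHz
f_low, f_high = f0_ghz - bw / 2, f0_ghz + bw / 2
print(f"passband ≈ {f_low:.2f}-{f_high:.2f} GHz")  # ≈ 1.04-1.73 GHz
```

The implied passband of roughly 1.04-1.73 GHz comfortably spans the GNSS signals, which cluster between about 1.15 and 1.61 GHz.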

Relevance: 30.00%

Abstract:

In June 2015, the legal frameworks of the Asian Infrastructure Investment Bank were signed by its 57 founding members. Proposed and initiated by China, this multilateral development bank is considered to be an Asian counterpart created to break the monopoly of the World Bank and the International Monetary Fund. In October 2015, China’s central bank announced a benchmark interest rate cut to combat the economic slowdown. The easing policy coincided with the European Central Bank’s announcement of doubts over the US Fed’s commitment to raising interest rates. Global stock markets responded positively to China’s move, with the exception of the indexes on Wall Street (Bland, 2015; Elliott, 2015). In the meantime, China’s ‘One Belt, One Road’ (or New Silk Road Economic Belt) became a topic of discourse in relation to its growing global economy, as China pledged $40 billion to trade and infrastructure projects (Bermingham, 2015). The foreign policy aims to reinforce the economic belt from western China through Central Asia towards Europe, as well as to construct maritime trading routes from coastal China through the South China Sea (Summers, 2015). In 2012, The Economist launched a new China section to reveal the complexity of the ‘meteoric rise’ of China. John Micklethwait, who was then the chief editor of the magazine, said that China’s emergence as a global power justified giving it a section of its own (Roush, 2012). In July 2015, Hu Shuli, the former chief editor of Caijing, announced the launch of a think tank and financial data service division called Caixin Insight Group, which encompasses the new Caixin China Purchasing Managers Index (PMI). In cooperation with Markit Group, a principal global provider of PMI data, the index soon became a widely cited economic indicator. One anecdote from November’s Caixin shows how much has changed: in a high-profile dialogue between Hu Shuli and Kevin Rudd, Hu insisted on asking questions in English; interestingly, the former Prime Minister of Australia insisted on replying in Chinese. These recent developments point to one thing: the economic ascent of China and its increasing influence on the power play between economics and politics in world markets. China has begun to take a more active role in rule-making and enforcement under neoliberal frameworks. However, due to the country’s size and the scale of its economy in comparison to other countries, China’s version of globalisation has unique characteristics. The ‘capitalist-socialist’ paradox is vital to China’s market-oriented transformation. In order to comprehend how such unique features are articulated and understood, several questions are worth investigating in the realms of media and communication studies, such as how China’s neoliberal restructuring is portrayed and perceived by different types of interested parties, and how these portrayals are de-contextualised and re-contextualised in global or Anglo-American narratives. Therefore, based on a combination of the themes of globalisation, financial media and China’s economic integration, this thesis attempts to explore how financial media construct the narratives of China’s economic globalisation through the deployment of comparative and multi-disciplinary approaches.
Two outstanding elite financial magazines, Britain’s The Economist, which has a global readership and influence, and Caijing, China’s leading financial magazine, are chosen as case studies to exemplify differing media discourses, representing, respectively, Anglo-American and Chinese socio-economic and political backgrounds, as well as their own journalistic cultures. This thesis tries to answer the questions of how and why China’s neoliberal restructuring is constructed from a globally oriented perspective. The construction primarily involves people who are influential in business and policymaking. Hence, the analysis falls into the paradigm of elite-elite communication, which is an important but relatively underdeveloped perspective in studying China and its globalisation. The comparison of the characteristics of narrative construction is the result of textual analysis of articles published over a ten-year period (mid-1998 to mid-2008). The corpus of samples comes from the two media outlets’ coverage of three selected events: China becoming a member of the World Trade Organization, its outward direct investment, and the listing of stocks of Chinese companies on overseas exchanges; these events are mutually exclusive in sample collection and collectively exhaustive in the inclusion of articles regarding China’s economic globalisation. The findings help in understanding that, despite language, socio-economic and political differences, elite financial media with globally oriented readerships share similar methods of and approaches to agenda setting, the evaluation of news prominence, the selection of frames, and the advocacy of deeply rooted neoliberal ideas. The comparison of their distinctive features reflects the different phases of building up a sense of identity in their readers as global elites, as well as the different economic interests that are aligned with the corresponding readerships. However, textual analysis is only relevant in terms of exploring how the narratives are constructed and the elements they include; textual analysis alone prevents us from seeing the obstacles and constraints of the journalistic practices of construction. Therefore, this thesis provides a brief discussion of interviews with practitioners from the two media outlets, in order to understand how similar or different narratives are manifested and perceived, how the concept of neoliberalism deviates from and is justified in the Chinese context, and how and for what purpose deviations arise between Western and Chinese contexts. The thesis also contributes to defining financial media in the domain of elite communication. The relevant and closely interlocking concepts of globalisation, elitism and neoliberalism are discussed, and are used as a theoretical bedrock in the analysis of texts and contexts. It is important to address the agenda-setting and ideological role of elite financial media because of their narrative formula of infusing business facts with opinions, which is important in constructing the global elite identity as well as in influencing neoliberal policy-making. On the other hand, ‘journalistic professionalism’ has been redefined, in that the elite identity is shared by the content producer, the reader and the actors in the news stories emerging from the much-compressed news cycle. The professionalism of elite financial media thus requires a dual definition: being professional in the understanding of business facts and statistics, and being professional in making sense of stories by deploying economic logic.

Relevance: 30.00%

Abstract:

With applications ranging from aerospace to biomedicine, additive manufacturing (AM) has been revolutionizing the manufacturing industry. The ability of additive techniques, such as selective laser melting (SLM), to create fully functional, geometrically complex, and unique parts out of high-strength materials is of great interest. Unfortunately, despite the numerous advantages afforded by this technology, its widespread adoption is hindered by a lack of on-line, real-time feedback control and quality assurance techniques. In this thesis, inline coherent imaging (ICI), a broadband, spatially coherent imaging technique, is used to observe the SLM process in 15-45 $\mu m$ 316L stainless steel powder. Imaging of both single- and multilayer builds is performed at a rate of 200 $kHz$, with a resolution of tens of microns and a high dynamic range rendering it impervious to blinding from the process beam. This allows imaging before, during, and after laser processing to observe changes in the morphology and stability of the melt. Galvanometer-based scanning of the imaging beam relative to the process beam during the creation of single tracks is used to gain a unique perspective of the SLM process that has so far been unobservable by other monitoring techniques. Single-track processing is also used to investigate the possibility of a preliminary feedback control parameter based on the process beam power, through imaging with both coaxial and 100 $\mu m$ offset alignment with respect to the process beam. The 100 $\mu m$ offset improved imaging, increasing the number of bright A-lines (i.e. those with signal greater than the 10 $dB$ noise floor) by 300\%. The overlap between adjacent tracks in a single layer is imaged to detect characteristic fault signatures. Full multilayer builds are carried out, and the resultant ICI images are used to detect defects in the finished part and improve upon the initial design of the build system. Damage to the recoater blade is assessed using powder layer scans acquired during a 3D build. The ability of ICI to monitor SLM processes at such high rates with high resolution offers extraordinary potential for future advances in on-line feedback control of additive manufacturing.
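The bright A-line metric lends itself to a simple sketch; the values below are synthetic placeholders standing in for coaxial versus offset ICI data, not real measurements:

```python
import numpy as np

def bright_fraction(peak_snr_db, noise_floor_db=10.0):
    """Fraction of A-lines whose peak SNR clears the noise floor."""
    return float(np.mean(np.asarray(peak_snr_db) > noise_floor_db))

# Hypothetical per-A-line peak SNRs for coaxial vs 100 um offset alignment.
rng = np.random.default_rng(0)
coaxial = rng.normal(6.0, 5.0, 200_000)    # many A-lines below the 10 dB floor
offset = rng.normal(13.0, 5.0, 200_000)    # offset alignment returns more signal
print(bright_fraction(coaxial), bright_fraction(offset))
```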