912 results for Digital Manufacturing, Digital Mock Up, Simulation Intent
Abstract:
The principal feature in the evolution of the internet has been its ever-growing reach, taking in old and young, rich and poor. The internet's ever-encroaching presence has transported it from our desktop to our pocket and into our glasses. This is illustrated in the Internet Society Questionnaire on Multistakeholder Governance, which found that the main factors affecting change in the internet governance landscape were more users online from more countries and the influence of the internet over daily life. The omnipresence of the internet is self-perpetuating; its usefulness grows with every new user and every new piece of data uploaded. The advent of social media and the creation of a virtual presence for each of us, even when we are not physically present or 'logged on', means we are fast approaching the point where we are all connected, to everyone else, all the time. We have moved far beyond the point where governments can claim to represent our views, which evolve constantly rather than being measured in electoral cycles.
The shift that has seen citizens become creators of content rather than consumers of it has undermined the centralist view of democracy and created an environment of 'wiki democracy' or crowd-sourced democracy. This is at the heart of what is generally known as Web 2.0, and is widely considered a positive, democratising force. However, we argue, there are worrying elements here too. Government does not always deliver on the promise of the networked society as it involves citizens and others in the process of government. Moreover, a number of key internet companies have emerged as powerful intermediaries, harnessing the efforts of the many and re-using and re-selling the products and data of content providers in the Web 2.0 environment. A discourse about openness and transparency has been offered as a democratising rationale, but much of this masks an uneven relationship in which the value of online activity flows not to the creators of content but to those who own the channels of communication and the metadata they produce.
In this context the state is just one stakeholder in the mix of influencers and opinion formers shaping our behaviours, and indeed our ideas of what is public. The questions of what it means to create or own something, and of how all these new relationships are to be ordered and governed, are subject to fundamental change. While government can often appear slow, unwieldy and even irrelevant in much of this context, there remains a need for some sort of political control to deal with the challenges that technology creates but cannot by itself control. For the internet to continue to evolve successfully, both technically and socially, it is critical that the multistakeholder nature of internet governance be understood and acknowledged, and perhaps, to an extent, re-balanced. Stakeholders can no longer be classified under the broad headings of government, private sector and civil society, with their roles seen as some sort of benign and open co-production. Each user of the internet has a stake in its efficacy, and each, by their presence and participation, contributes to the experience, positive or negative, of other users, as well as to the commercial success or otherwise of various online service providers. However, stakeholders have neither an equal role nor an equal share. The unequal relationship between the providers of content and those who simply package up and transmit that content - while harvesting the valuable data thus produced - needs to be addressed. Arguably this suggests a role for government that moves beyond simply celebrating and facilitating the ongoing technological revolution. This paper reviews the shifting landscape of stakeholders and their contributions to the efficacy of the internet. It critically evaluates the primacy of the individual as the key stakeholder and their supposed developing empowerment within the ever-growing sea of data. It also looks at the role of individuals in wider governance roles.
Governments in a number of jurisdictions have sought to engage, consult or empower citizens through technology, but in general these attempts have had little appeal. Citizens have been too busy engaging, consulting and empowering each other to pay much attention to what their governments are up to. George Orwell's view of the future has not come to pass; in fact, the internet has ensured the opposite. There is no Big Brother, but we are all looking over each other's shoulders all the time, while a number of big corporations capture and sell all this collective endeavour back to us.
Abstract:
BACKGROUND: Lacrimo-auriculo-dento-digital (LADD) syndrome (OMIM #149730) is an autosomal-dominant congenital disorder that can be caused by heterozygous mutations in the tyrosine kinase domains of the genes encoding fibroblast growth factor receptors 2 (FGFR2) and 3 (FGFR3), and has also been found in association with a mutation in the FGF10 gene, which encodes an FGFR ligand. Clinical signs vary, but the condition is characterised by involvement of the lacrimal and salivary systems, cup-shaped ears, hearing loss and dental abnormalities. Additional features may include involvement of the hands and feet, and of other body systems, particularly the kidneys.
CASE REPORT: Previous literature on the subject is reviewed; this case is the first presentation of LADD syndrome in the Republic of Ireland, occurring sporadically in a 12-year-old girl who exhibited a range of dental and digital anomalies.
TREATMENT: Her general medical practitioner managed her medical care whilst her oral care necessitated a multidisciplinary approach involving restorative and orthodontic elements.
FOLLOW-UP: The initial restorative phase of treatment has successfully improved the appearance of the patient's anterior teeth using direct resin composite build-ups.
Abstract:
Digital pathology and the adoption of image analysis have grown rapidly in the last few years. This is largely due to the implementation of whole-slide scanning, advances in software and computer processing capacity, and the increasing importance of tissue-based research for biomarker discovery and stratified medicine. This review sets out the key application areas for digital pathology and image analysis, with a particular focus on research and biomarker discovery. A variety of image analysis applications are reviewed, including nuclear morphometry and tissue architecture analysis, with emphasis on immunohistochemistry and fluorescence analysis of tissue biomarkers. Digital pathology and image analysis have important roles across the drug/companion diagnostic development pipeline, including biobanking, molecular pathology, tissue microarray analysis and molecular profiling of tissue, and these developments are reviewed. Underpinning all of them is the need for high-quality tissue samples, and the impact of pre-analytical variables on tissue research is discussed. This requirement is combined with practical advice on setting up and running a digital pathology laboratory. Finally, we discuss the need to integrate digital image analysis data with epidemiological, clinical and genomic data in order to fully understand the relationship between genotype and phenotype and to drive discovery and the delivery of personalized medicine.
Abstract:
The design of a high-performance IIR (infinite impulse response) digital filter is described. The chip architecture operates on 11-b parallel two's complement input data with a 12-b parallel two's complement coefficient to produce a 14-b two's complement output. The chip is implemented in 1.5-µm, double-layer-metal CMOS technology, consumes 0.5 W, and can operate up to 15 Msample/s. The main component of the system is a fine-grained systolic array that is internally based on a signed binary number representation (SBNR). Issues addressed include testing, clock distribution, and circuitry for conversion between two's complement and SBNR.
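The word-length trade-offs described in this abstract can be sketched in a simple fixed-point model. The first-order section, the scaling shift and the function names below are illustrative assumptions, not the chip's actual datapath; only the 11-b input, 12-b coefficient and 14-b output widths come from the text:

```python
def quantize(x, bits):
    """Clamp x to the two's-complement range of the given word length."""
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    return max(lo, min(hi, int(x)))

def iir_first_order(samples, a_coeff, in_bits=11, coeff_bits=12,
                    out_bits=14, shift=11):
    """y[n] = x[n] + (a * y[n-1]) >> shift, with the word-length limits
    of an 11-b input / 12-b coefficient / 14-b output datapath.
    The shift rescales the full-precision product back to sample range."""
    a = quantize(a_coeff, coeff_bits)
    y_prev = 0
    out = []
    for x in samples:
        x = quantize(x, in_bits)
        acc = x + ((a * y_prev) >> shift)
        y_prev = quantize(acc, out_bits)
        out.append(y_prev)
    return out
```

With `shift=11`, a coefficient of 1024 corresponds to a feedback gain of 0.5, so an impulse decays by half each sample until quantization intervenes.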
Abstract:
Phase and gain mismatches between the I and Q analog signal-processing paths of a quadrature receiver generate image signals that limit the dynamic range of a practical receiver. In this paper we analyse the effects of these mismatches and propose a low-complexity blind adaptive algorithm to minimize this problem. The proposed solution is based on two 2-tap adaptive filters arranged in an Adaptive Noise Canceller (ANC) set-up. The algorithm lends itself to efficient real-time implementation with minimal increase in modulator complexity.
Abstract:
I and Q channel phase and gain mismatches are of great concern in communications receiver design. In this paper we analyse the effects of I and Q channel mismatches and propose a low-complexity blind adaptive algorithm to minimize this problem. The proposed solution consists of two 2-tap adaptive filters arranged in an Adaptive Noise Canceller (ANC) set-up, with the output of one cross-fed to the input of the other. The system works as a de-correlator, eliminating I and Q mismatch errors.
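The de-correlator idea can be sketched numerically. This is a simplified illustration, not the paper's algorithm: the function name, step size and test tones are assumptions, and the cross-feed is approximated by letting each 2-tap LMS branch filter the other branch's raw input rather than its output:

```python
def iq_decorrelate(i_sig, q_sig, mu=0.005):
    """Two cross-coupled 2-tap LMS cancellers: each branch subtracts the
    component of its input that correlates with the other branch, driving
    the cross-correlation of the two outputs toward zero."""
    wi = [0.0, 0.0]  # taps estimating the I-leakage present in Q
    wq = [0.0, 0.0]  # taps estimating the Q-leakage present in I
    i_out, q_out = [], []
    for n in range(len(i_sig)):
        xi = [i_sig[n], i_sig[n - 1] if n else 0.0]
        xq = [q_sig[n], q_sig[n - 1] if n else 0.0]
        ei = i_sig[n] - (wq[0] * xq[0] + wq[1] * xq[1])  # cleaned I
        eq = q_sig[n] - (wi[0] * xi[0] + wi[1] * xi[1])  # cleaned Q
        for k in range(2):  # LMS updates on both branches
            wq[k] += mu * ei * xq[k]
            wi[k] += mu * eq * xi[k]
        i_out.append(ei)
        q_out.append(eq)
    return i_out, q_out
```

Feeding it an I tone plus a Q tone contaminated with a scaled copy of I, the residual lag-zero correlation between the two outputs shrinks well below that of the inputs as the taps converge.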
Abstract:
This paper details the design of a power-aware adaptive digital image-rejection receiver, based on blind source separation, that alleviates RF analog front-end impairments. Power-aware system design at the RTL level, without redesigning arithmetic circuits, is used to reduce power consumption in nomadic devices. Power-aware multipliers with configurable precision trade off image-rejection-ratio (IRR) performance against power consumption. Simulation case studies demonstrate that the IRR performance of the power-aware system is comparable to that of the normal implementation: slightly degraded, but well within acceptable limits.
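One common way to build a configurable-precision multiplier is to gate off low-order partial-product rows, which behaves like truncating the operands' least-significant bits. The sketch below models that behaviour; the function name, word lengths and masking scheme are illustrative assumptions, not the paper's circuit:

```python
def pa_multiply(a, b, active_bits, full_bits=16):
    """Configurable-precision multiply: zero the (full_bits - active_bits)
    least-significant bits of each operand before multiplying, emulating
    power-gated low-order partial-product rows. Lower active_bits saves
    switching power at the cost of a small product error (hence IRR loss)."""
    mask = ~((1 << (full_bits - active_bits)) - 1)
    return (a & mask) * (b & mask)
```

At full precision the result is exact; dropping four bits of each operand introduces only a fraction-of-a-percent error here, which mirrors the abstract's claim that IRR degrades slightly while remaining acceptable.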
Abstract:
This paper considers the following question: where do computers, laptops and mobile phones come from, and who produced them? Specific cases of digital labour are examined: the extraction of minerals in African mines under slave-like conditions; ICT manufacturing and assembly in China (Foxconn); software engineering in India; call-centre service work; software engineering at Google in Silicon Valley; and the digital labour of internet prosumers/users. Empirical data and empirical studies concerning these cases are systematically analysed and theoretically interpreted. The theoretical interpretations are grounded in Marxist political economy. The term 'global value chain' is criticised in favour of a complex and multidimensional understanding of Marx's 'mode of production' for the purpose of conceptualising digital labour. This kind of labour is transnational and involves various modes of production, relations of production and organisational forms (in the context of the productive forces). There is a complex global division of digital labour that connects and articulates various forms of productive forces, exploitation, modes of production, and variations within the dominant capitalist mode of production.
Abstract:
Dissertation submitted for the degree of Master in Electrical Engineering, branch of Automation and Industrial Electronics
Abstract:
Dissertation submitted for the degree of Master in Electrical Engineering, branch of Automation and Industrial Electronics
Abstract:
With consumer demand for more varied and higher-quality information, as well as for interactive products, the need arose to offer content beyond ordinary television programming. With the technological advances in the television industry and its distribution to Portuguese homes by TV operators, the number of channels on offer ceased to be the focus, and improving the customer experience became the priority. The introduction of new features in set-top boxes, such as the ability to present additional information about programmes, from trailers to detailed cast listings, gives customers a new way of interacting with TV services. The scheduled-recording feature led to the next point of improvement in the customer experience: recordings that were improperly clipped at the beginning or the end were one of the reasons that led TV operators to seek a better service for managing electronic programme guides. InfoPortugal, the company behind this project and the EPG provider for several national TV operators, was therefore obliged to update its content-distribution systems to meet its customers' evolving requirements.
Abstract:
Unilever Food Solutions' new digital CRM platform: what combination of tools, processes and content will help Unilever Food Solutions grow its business? Unilever Food Solutions (UFS) intends to create a new online platform to enable it to communicate with segments of the market that have previously been too difficult to reach. Specifically targeted at chefs and other food professionals, the aim is to create an interactive website that delivers value to its intended users by providing a variety of relevant content and functions, while simultaneously opening up a potential transactional channel to those same users.
Abstract:
In 1903, the eastern slope of Turtle Mountain (Alberta) was affected by a 30 Mm³ rockslide, named the Frank Slide, that resulted in more than 70 casualties. Assuming that the main discontinuity sets, including bedding, control part of the slope morphology, the structural features of Turtle Mountain were investigated using a digital elevation model (DEM). Using new landscape-analysis techniques, we have identified three main joint and fault sets. These results agree with the sets identified through field observations. Landscape-analysis techniques using a DEM confirm and refine the most recent geological model of the Frank Slide. The rockslide was initiated along bedding and a fault at the base of the slope and propagated up-slope by a regressive process following a surface composed of pre-existing discontinuities. The DEM analysis also permits the identification of important geological structures along the 1903 slide scar. Based on the so-called Sloping Local Base Level (SLBL), an estimate was made of the present unstable volumes in the main scar delimited by the cracks, and around the southern area of the scar (South Peak). The SLBL is a method permitting a geometric interpretation of the failure surface based on a DEM. Finally, we propose a failure mechanism for the progressive failure of the rock mass that considers gently dipping wedges (30°). The prisms or wedges defined by two discontinuity sets permit the creation of a failure surface by progressive failure. Such structures are more commonly observed in recent rockslides. This method is efficient and is recommended as a preliminary analysis prior to field investigation.
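The SLBL's geometric idea can be illustrated in one dimension: iteratively lower every point that sits above the mean of its neighbours until a smooth basal surface remains, then take the volume between the DEM and that surface as the unstable mass. This is a minimal sketch under that reading of the method; the function names, the 1-D simplification and the convergence scheme are illustrative assumptions:

```python
def slbl_1d(profile, tol=0.0, iters=1000):
    """1-D Sloping Local Base Level: repeatedly lower each interior point
    to the mean of its two neighbours (minus a curvature tolerance `tol`)
    whenever it lies above that value. Endpoints stay fixed; the result is
    a candidate failure surface beneath the topographic profile."""
    z = list(profile)
    for _ in range(iters):
        changed = False
        for i in range(1, len(z) - 1):
            target = 0.5 * (z[i - 1] + z[i + 1]) - tol
            if z[i] > target:
                z[i] = target
                changed = True
        if not changed:
            break
    return z

def unstable_area(profile, base):
    """Cross-sectional area (per unit cell width) between DEM and SLBL."""
    return sum(a - b for a, b in zip(profile, base))
```

Points already at or below the local base level are never raised, so only the protruding (potentially unstable) mass contributes to the estimated area, and a 2-D DEM version would sum this over all profiles to get a volume.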
Abstract:
An Ising-like model, with interactions ranging up to next-nearest-neighbor pairs, is used to simulate the process of interface alloying. Interactions are chosen to stabilize an intermediate "antiferromagnetic" ordered structure. The dynamics proceeds exclusively by atom-vacancy exchanges. In order to characterize the process, the time evolution of the width of the intermediate ordered region and the diffusion length is studied. Both lengths are found to follow a power-law evolution with exponents depending on the characteristic features of the model.
Abstract:
The rapid growth in high-data-rate communication systems has introduced new spectrally efficient modulation techniques and standards such as LTE-A (Long Term Evolution-Advanced) for 4G (4th generation) systems. These techniques provide broader bandwidth but introduce a high peak-to-average power ratio (PAR) problem at the high-power amplifier (HPA) of the communication system base transceiver station (BTS). To avoid spectral spreading due to high PAR, stringent linearity requirements force the HPA to operate at large back-off power at the expense of power efficiency. Consequently, high-power devices that combine high linearity and efficiency are fundamental to HPAs. Recent development in wide-bandgap power devices, in particular the AlGaN/GaN HEMT, has offered higher power levels with a superior linearity-efficiency trade-off in microwave communications. For a cost-effective HPA design-to-production cycle, rigorous computer-aided design (CAD) models of AlGaN/GaN HEMTs are essential to reflect the real response with increasing power level and channel temperature. Therefore, a large-signal electrothermal modeling procedure for large-size AlGaN/GaN HEMTs is proposed. The HEMT structure analysis, characterization, data processing, model extraction and model implementation phases are covered in this thesis, including trapping and self-heating dispersion accounting for nonlinear drain-current collapse. The small-signal model is extracted using the 22-element modeling procedure developed in our department. The intrinsic large-signal model is investigated in depth in conjunction with linearity prediction. The accuracy of the nonlinear drain current has been enhanced through several measures, such as trapping and self-heating characterization. Also, the thermal profile of the HEMT structure has been investigated and the corresponding thermal resistance extracted through thermal simulation and chuck-controlled temperature pulsed I(V) and static DC measurements.
A higher-order equivalent thermal model is extracted and implemented in the HEMT large-signal model to accurately estimate the instantaneous channel temperature. Moreover, trapping and self-heating transients have been characterized through transient measurements. The obtained time constants are represented by equivalent sub-circuits and integrated into the nonlinear drain-current implementation to predict the dynamics of complex communication signals. Verification of this table-based, large-size, large-signal electrothermal model has demonstrated high accuracy in terms of output power, gain, efficiency and nonlinearity prediction with respect to standard large-signal test signals.
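A higher-order thermal equivalent of this kind is often realised as a Foster-type RC ladder, where each stage contributes a thermal resistance and a time constant, and the channel-temperature rise is the dissipated power times the sum of the stage step responses. The sketch below illustrates that structure only; the stage values, ambient temperature and function name are illustrative assumptions, not parameters extracted in the thesis:

```python
import math

def channel_temperature(p_diss, t, stages, t_amb=25.0):
    """Step response of a Foster-type thermal network.
    stages: list of (R_th [K/W], tau [s]) pairs, one per RC stage.
    Each stage contributes a first-order exponential rise; their sum,
    scaled by the dissipated power, gives the channel-temperature rise
    above ambient at time t after a power step."""
    rise = sum(r * (1.0 - math.exp(-t / tau)) for r, tau in stages)
    return t_amb + p_diss * rise
```

At t = 0 the model returns the ambient temperature; for t much longer than the slowest time constant it settles at T_amb + P * sum(R_th), which is why short pulsed I(V) measurements (fast stages only) and static DC measurements (full rise) together constrain the stage values.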