861 results for design processes
Abstract:
Transition processes in higher education are characterized by new learning situations that pose challenges to most students. This chapter explores the heterogeneity of reactions to these challenges from the perspective of regulation processes. The Integrated Model of Learning and Action is used to identify different patterns of motivational regulation among university students by means of mixed distribution models. Six subpopulations of motivational regulation could be identified: students with self-determined, pragmatic, strategic, negative, anxious, and insecure learning motivation. Findings about these patterns can be used to design didactic measures that support students' learning processes.
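The "mixed distribution models" mentioned above are, at their core, finite mixture models fitted by expectation-maximization. As a purely illustrative sketch (the chapter's actual model, variables, and data are not shown here), a minimal 1-D Gaussian-mixture EM separating two hypothetical subpopulations of a motivation score might look like:

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=200):
    """Fit a k-component 1-D Gaussian mixture with plain EM.

    Returns (weights, means, stds). Deterministic quantile-based
    initialization keeps this toy example reproducible."""
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))
    sd = np.full(k, x.std())
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        # (the 1/sqrt(2*pi) constant cancels in the normalization)
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / sd
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted points
        n = resp.sum(axis=0)
        w = n / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return w, mu, sd

# Two hypothetical subpopulations of a motivation score
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(2.0, 0.5, 300), rng.normal(4.5, 0.5, 200)])
w, mu, sd = em_gmm_1d(x)
print(np.sort(mu))  # component means close to the true 2.0 and 4.5
```

Each student is then assigned to the subpopulation with the highest responsibility, which is how mixture models yield discrete motivational patterns from continuous scores.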
Abstract:
The power-law size distributions obtained experimentally for neuronal avalanches are important evidence of criticality in the brain. This evidence is supported by the fact that a critical branching process exhibits the same exponent, τ ≈ 3/2. Models at criticality have been employed to mimic avalanche propagation and explain the statistics observed experimentally. However, a crucial aspect of neuronal recordings has been almost completely neglected in the models: undersampling. While in a typical multielectrode array hundreds of neurons are recorded, in the same area of neuronal tissue tens of thousands of neurons can be found. Here we investigate the consequences of undersampling in models with three different topologies (two-dimensional, small-world and random network) and three different dynamical regimes (subcritical, critical and supercritical). We found that undersampling modifies avalanche size distributions, extinguishing the power laws observed in critical systems. Distributions from subcritical systems are also modified, but the shape of the undersampled distributions is more similar to that of a fully sampled system. Undersampled supercritical systems can recover the general characteristics of the fully sampled version, provided that enough neurons are measured. Undersampling in two-dimensional and small-world networks leads to similar effects, while the random network is insensitive to sampling density due to the lack of a well-defined neighborhood. We conjecture that neuronal avalanches recorded from local field potentials avoid undersampling effects due to the nature of this signal, but the same does not hold for spike avalanches. We conclude that undersampled branching-process-like models in these topologies fail to reproduce the statistics of spike avalanches.
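A critical branching process of the kind invoked above is straightforward to simulate. The sketch below is a generic Galton-Watson process with Poisson offspring, not the authors' specific network models: it generates avalanche sizes at the critical branching ratio σ = 1, whose distribution follows the τ ≈ 3/2 power law, and applies a naive form of undersampling in which each activation is observed independently with probability p.

```python
import numpy as np

def avalanche_size(sigma, rng, max_steps=2000):
    """One avalanche of a Galton-Watson branching process.

    Each active unit triggers a Poisson(sigma) number of units in the
    next step; sigma = 1 is the critical branching ratio. Returns the
    total number of activations (the avalanche size), truncated at
    max_steps generations."""
    active, size = 1, 1
    for _ in range(max_steps):
        if active == 0:
            break
        active = rng.poisson(sigma * active)
        size += active
    return size

rng = np.random.default_rng(0)
sizes = np.array([avalanche_size(1.0, rng) for _ in range(5000)])

# Naive undersampling: each activation is observed independently with
# probability p, as if electrodes covered only a small fraction of the
# tissue. Many small avalanches then vanish from the record entirely.
p = 0.05
observed = rng.binomial(sizes, p)

print(sizes.max(), (observed == 0).mean())
```

Note that this independent-thinning form of subsampling is a simplification; in the spatially structured topologies studied in the abstract, the recorded neurons form a contiguous patch, which is what breaks the power law more severely.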
Abstract:
The valorization of glycerol has been widely studied, notably due to the oversupply of glycerol from biodiesel production. Among the different upgrading reactions, dehydration to acrolein is of high interest due to the importance of acrolein as an intermediate for the polymer industry (via acrylic acid) and as a feed additive (a synthon for DL-methionine). It is known that acrolein can be obtained by catalytic dehydration of glycerol over acid catalysts. Zeolite and heteropolyacid catalysts are initially highly active but deactivate rapidly with time on stream by coking, whilst mixed metal oxides are more stable catalytic systems but less selective, and in addition they require an activation period. In this talk, the strategy we followed is described. It consisted of a parallel approach in which we developed supported heteropolyacid-based catalysts with increased stability and acrolein selectivity, using a ZrO2-grafted SBA-15 as the support for a silicotungstic acid active phase, as well as a new concept based on a two-zone fluidized bed reactor (TZFBR) to tackle the unavoidable deactivation issue of the HPA catalysts. This type of reactor comprises, in one single vessel, reaction and regeneration zones. In the second part of the lecture the REALCAT platform was introduced. REALCAT (a French acronym standing for 'Advanced High-Throughput Technologies Platform for Biorefineries Catalysts Design') is a highly integrated platform devoted to accelerating innovation in all fields of industrial catalysis, with an emphasis on emergent biorefinery catalytic processes. In this extremely competitive field, REALCAT consists of a versatile High-Throughput Technologies (HTT) platform devoted to innovation in heterogeneous, homogeneous, or biocatalysts AND their combinations under the very novel, ultra-efficient concept of hybrid catalysis.
Abstract:
Part 20: Health and Care Networks
Abstract:
Background. Teachers' legitimacy is central to school functioning. Teachers' justice, whether distributive or procedural, predicts teachers' legitimacy. Aims. What is still to be found, and constitutes the goal of this paper, is whether unjust treatment by a teacher affects the legitimacy of the teacher differently when the student knows that the teacher was fair to a peer (comparative judgement) or when the student does not have that information (autonomous judgement). Samples. A total of 79 high school students participated in Study 1; 75 high school students participated in Study 2. Methods. Two experimental studies with a 2 justice valence (just, unjust) × 2 social comparison processes (autonomous judgements, comparative judgements) between-participants design were conducted. Study 1 addressed distributive justice and Study 2 addressed procedural justice. The dependent variable was teachers' legitimacy. Results. In both studies, situations perceived as just led to higher teachers' legitimacy than situations perceived as unjust. For the distributive injustice conditions, teachers' legitimacy was equally lower for the autonomous judgement and comparative judgement conditions. For procedural injustice, teachers' legitimacy was lower when the peer was treated justly and the participant was treated unfairly, compared with the condition in which the participants did not know how the teacher treated the peer. Conclusions. We conclude that teachers' injustice affects teachers' legitimacy, but it does so differently according to the social comparisons and the type of justice involved. Moreover, these results highlight that social comparisons are an important psychological process and, therefore, should be taken into account in models of justice.
Abstract:
Biogeochemical-Argo is the extension of the Argo array of profiling floats to include floats that are equipped with biogeochemical sensors for pH, oxygen, nitrate, chlorophyll, suspended particles, and downwelling irradiance. Argo is a highly regarded, international program that measures the changing ocean temperature (heat content) and salinity with profiling floats distributed throughout the ocean. Newly developed sensors now allow profiling floats to also observe biogeochemical properties with sufficient accuracy for climate studies. This extension of Argo will enable an observing system that can determine the seasonal to decadal-scale variability in biological productivity, the supply of essential plant nutrients from deep waters to the sunlit surface layer, ocean acidification, hypoxia, and ocean uptake of CO2. Biogeochemical-Argo will drive a transformative shift in our ability to observe and predict the effects of climate change on ocean metabolism, carbon uptake, and living marine resource management. Presently, vast areas of the open ocean are sampled only once per decade or less, with sampling occurring mainly in summer. Our ability to detect changes in biogeochemical processes that may occur due to the warming and acidification driven by increasing atmospheric CO2, as well as by natural climate variability, is greatly hindered by this undersampling. In close synergy with satellite systems (which are effective at detecting global patterns for a few biogeochemical parameters, but only very close to the sea surface and in the absence of clouds), a global array of biogeochemical sensors would revolutionize our understanding of ocean carbon uptake, productivity, and deoxygenation. The array would reveal the biological, chemical, and physical events that control these processes.
Such a system would enable a new generation of global ocean prediction systems in support of carbon cycling, acidification, hypoxia and harmful algal bloom studies, as well as the management of living marine resources. In order to prepare for a global Biogeochemical-Argo array, several prototype profiling float arrays have been developed at the regional scale by various countries and are now operating. Examples include regional arrays in the Southern Ocean (SOCCOM), the North Atlantic Subpolar Gyre (remOcean), the Mediterranean Sea (NAOS), the Kuroshio region of the North Pacific (INBOX), and the Indian Ocean (IOBioArgo). For example, the SOCCOM program is deploying 200 profiling floats with biogeochemical sensors throughout the Southern Ocean, including areas covered seasonally with ice. The resulting data, which are publicly available in real time, are being linked with computer models to better understand the role of the Southern Ocean in influencing CO2 uptake, biological productivity, and nutrient supply to distant regions of the world ocean. The success of these regional projects has motivated a planning meeting to discuss the requirements for and applications of a global-scale Biogeochemical-Argo program. The meeting was held 11-13 January 2016 in Villefranche-sur-Mer, France, with attendees present from the eight nations now deploying Argo floats with biogeochemical sensors. In preparation, computer simulations and a variety of analyses were conducted to assess the resources required for the transition to a global-scale array. Based on these analyses and simulations, it was concluded that an array of about 1000 biogeochemical profiling floats would provide the needed resolution to greatly improve our understanding of biogeochemical processes and to enable significant improvement in ecosystem models.
With an endurance of four years for a Biogeochemical-Argo float, this system would require the procurement and deployment of 250 new floats per year to maintain a 1000-float array. The lifetime cost for a Biogeochemical-Argo float, including capital expense, calibration, data management, and data transmission, is about $100,000. A global Biogeochemical-Argo system would thus cost about $25,000,000 annually. In the present Argo paradigm, the US provides half of the profiling floats in the array, while the EU, Austral/Asia, and Canada share most of the remaining half. If this approach is adopted, the US cost for the Biogeochemical-Argo system would be ~$12,500,000 annually and ~$6,250,000 each for the EU and for Austral/Asia and Canada. This includes no direct costs for ship time and presumes that float deployments can be carried out from future research cruises of opportunity, including, for example, the international GO-SHIP program (http://www.go-ship.org). The full-scale implementation of a global Biogeochemical-Argo system with 1000 floats is feasible within a decade. The successful, ongoing pilot projects have provided the foundation and start for such a system.
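The cost figures quoted above follow from simple arithmetic and can be checked directly (all values taken from the text):

```python
# Cost arithmetic for the proposed global Biogeochemical-Argo array
array_size = 1000            # target number of floats in the array
endurance_years = 4          # nominal lifetime of one float
cost_per_float = 100_000     # USD lifetime cost (capital, calibration,
                             # data management, data transmission)

floats_per_year = array_size / endurance_years   # replacements needed
annual_cost = floats_per_year * cost_per_float   # total annual cost, USD
us_share = annual_cost / 2                       # US provides half

print(floats_per_year)  # 250.0 floats per year
print(annual_cost)      # 25000000.0 USD per year
print(us_share)         # 12500000.0 USD per year
```

The remaining half of the annual cost is then split as ~$6.25M for the EU and ~$6.25M for Austral/Asia and Canada together, consistent with the shares stated above.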
Abstract:
"Seeing is believing": the proverb suits fluorescent imaging probes well. Since we can selectively and sensitively visualize small biomolecules, organelles such as lysosomes, neutral molecules, metal ions, and anions through cellular imaging, fluorescent probes can help shed light on physiological and pathophysiological pathways. Because these biomolecules are produced at low concentrations in biochemical pathways, general analytical techniques either fail to detect them or are not sensitive enough to differentiate their relative concentrations. During my Ph.D. study, I exploited synthetic organic techniques to design and synthesize fluorescent probes with desirable properties such as high water solubility, high sensitivity, and varying fluorescence quantum yields. I synthesized a highly water-soluble BODIPY-based turn-on fluorescent probe for endogenous nitric oxide. I also synthesized a series of cell-membrane-permeable near-infrared (NIR) pH-activatable fluorescent probes for lysosomal pH sensing. Fluorescent dyes are molecular tools for designing fluorescent bioimaging probes. This prompted me to design and synthesize a hybrid fluorescent dye with a functionalizable chlorine atom and to test the chlorine reactivity for fluorescent probe design. Carbohydrate-protein interactions are key to many biological processes, such as viral and bacterial infections, cell recognition and adhesion, and immune response. Among the analytical techniques aimed at studying these interactions, electrochemical biosensing is particularly efficient due to its low cost, ease of operation, and possibility for miniaturization. During my Ph.D., I synthesized a mannose-bearing aniline molecule, which was successfully tested as an electrochemical biosensor. A ferrocene-mannose conjugate with an anchoring group was also synthesized, which can be used as a potential electrochemical biosensor.
Abstract:
Biochemical processes carried out by chemoautotrophs such as nitrifiers and sulfide and iron oxidizers are used extensively in wastewater treatment. The research described in this dissertation involved the study of two selected biological processes utilized in wastewater treatment and mediated by chemoautotrophic bacteria: nitrification (biological removal of ammonia and nitrogen) and hydrogen sulfide (H2S) removal from odorous air using biofiltration. A municipal wastewater treatment plant (WWTP) receiving industrial dyeing discharge containing the azo dye acid black 1 (AB1) failed to meet discharge limits, especially during the winter. Dyeing discharge mixed with domestic sewage was fed to sequencing batch reactors at 22°C and 7°C. Complete nitrification failure occurred at 7°C, with more rapid nitrification failure as the dye concentration increased; slight nitrification inhibition occurred at 22°C. Dye-bearing wastewater reduced chemical oxygen demand (COD) removal at 7°C and 22°C, increased effluent total suspended solids (TSS) at 7°C, and reduced activated sludge quality at 7°C. Decreasing the AB1 loading resulted in partial nitrification recovery. Eliminating the dye-bearing discharge to the full-scale WWTP led to improved performance, bringing the WWTP into regulatory compliance. Biofilter™, a dynamic model describing the biofiltration processes for hydrogen sulfide removal from odorous air emissions, was calibrated and validated using pilot- and full-scale biofilter data. In addition, the model predicted the trend of the measured data under field conditions of changing input concentration and low effluent concentrations. The model demonstrated that increasing gas residence time and temperature and decreasing influent concentration decrease the effluent concentration. Model simulations also showed that longer residence times are required to treat loading spikes.
Biofilter™ was also used in the preliminary design of a full-scale biofilter for the removal of H2S from odorous air. Model simulations illustrated that plots of effluent concentration as a function of residence time or bed area are useful for characterizing and designing biofilters. Also, decreasing temperature significantly increased the effluent concentration. Model simulations showed that at a given temperature, a biofilter cannot reduce H2S emissions below a minimum value, no matter how large the biofilter.
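The dynamic model referenced above is a full biofiltration model, and its equations are not reproduced here. As an assumption-laden sketch only, the qualitative trends described (effluent concentration falling with gas residence time and with a temperature-dependent rate constant) can be illustrated with a generic first-order plug-flow approximation; note that this simple form decays to zero and so does not capture the minimum achievable effluent concentration the full model predicts.

```python
import math

def effluent_conc(c_in, k, ebrt):
    """Effluent concentration from a plug-flow biofilter bed with
    first-order removal: C_out = C_in * exp(-k * EBRT).

    c_in : influent concentration (e.g., ppmv H2S)
    k    : lumped first-order rate constant (1/s); in practice it
           rises with temperature, so colder beds remove less
    ebrt : empty-bed residence time (s)

    This is a textbook approximation, not the actual model above."""
    return c_in * math.exp(-k * ebrt)

# Doubling the residence time sharply lowers the effluent concentration,
# matching the trend reported from the simulations.
print(effluent_conc(50.0, 0.05, 30))  # shorter residence time
print(effluent_conc(50.0, 0.05, 60))  # doubled residence time
```

Under this approximation, a loading spike (larger c_in) raises the effluent concentration proportionally unless the residence time is increased, which is consistent with the simulated need for longer residence times during spikes.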
Abstract:
Cemeteries are places of everyday social practices and worship, and the tomb is where nostalgia can be externalized and the memory of the deceased revered. In Western societies we find a category of artifacts meant to evoke the memory of, or to honor, the dead. In this paper we discuss three examples of products that enabled a reflection on the concepts that gave rise to their forms, and that risk constituting a new "material culture", in that they may break with the traditional system of codes and standards shared by society and with its manifestations in the physical creation of this category of products. This work offers a reflection on product design. What probably makes it special is its field: the design of products for post-mortem memory. Usually made of granite or marble, these products take the form of a plate or tablet, an open book, or a rolled sheet. On one side they carry a photograph of the person to be honored, together with inscriptions. The design thinking inherent in this work weighs, on one side, the intricate set of emotions that this type of product can generate and, on the other, more tangible components concerning form, function, and the object's interactions with users and use environments. In defining the problem, differentiation, added value, and durability were regarded as mandatory requirements and key objectives. The first two should be manifested in the various components and attributes of the product. The aesthetic and material/structural durability of the product necessarily imply the introduction of qualifying terms and quantitative weights, which positively influence the generation and evaluation of concepts based on a set of 10 design principles that originated a matrix used as a tool to aid product design. The concrete definition of a target audience was equally important.
At this stage, the collaboration of experts in psychology and sociology, disciplines particularly able to understand individuals and social phenomena respectively, was crucial. It was concluded that a product designed to honor someone post mortem should abandon the more traditional habits and customs and focus on identifying new audiences. Although at present this can be considered a niche market, it is believed that it may grow in the future, along with interest in this type of product.
Abstract:
Cassava contributes significantly to biobased material development. Conventional approaches to producing and applying its bio-derivatives generate significant waste and pose tailored material development challenges, with negative environmental impact and application limitations. Transforming cassava into sustainable value-added resources requires redesigned approaches; harnessing unexplored material sources and downstream process innovations can mitigate these challenges. The ultimate goal was an integrated sustainable process system for cassava biomaterial development and potential application. An improved simultaneous release recovery cyanogenesis (SRRC) methodology, incorporating intact bitter cassava, was developed and standardized. Films were formulated and characterised, their mass transport behaviour was quantified under simulated real-distribution-chain conditions, and they were optimised for desirable properties. An integrated process design system for sustainable waste elimination and biomaterial development was developed. Films and bio-derivatives for desired MAP, fast-delivery nutraceutical excipients, and antifungal active coating applications were demonstrated. SRRC-processed intact bitter cassava produced a significantly higher yield of safe bio-derivatives than peeled cassava, guaranteeing 16% waste elimination. Process standardization transformed the entire root into higher-yield, clarified-colour bio-derivatives with an efficient material balance at optimal global desirability. Solvent mass transport through temperature- and humidity-stressed films induced structural changes and influenced water vapour and oxygen permeability. A seven-unit integrated process design led to cost-effective, energy-efficient, and green cassava processing and biomaterials with zero environmental footprint.
Desirable optimised bio-derivatives and films demonstrated application in desirable in-package O2/CO2, mould-growth inhibition, faster nutraceutical dissolution and release from tablet excipients, and smooth thymol-encapsulated antifungal coatings. Novel material resources, non-root-peeling, zero-waste elimination, and the desirable standardised methodology present promising process integration tools for sustainable cassava biobased system development. The emerging design outcomes have potential applications in mitigating cyanide challenges and providing bio-derivative development pathways. The process system leads to zero waste, with the potential to reshape current one-way processes into circular designs modelled on nature's effective approaches. Indigenous cassava components as natural material reinforcements, together with the SRRC processing approach, initiate a process with potential wider deployment in broad product research and development. This research contributes to scientific knowledge in material science and engineering process design.
Abstract:
Shearing is the process where sheet metal is mechanically cut between two tools. Various shearing technologies are commonly used in the sheet metal industry, for example in cut-to-length lines, slitting lines, and end cropping. Shearing has speed and cost advantages over competing cutting methods like laser and plasma cutting, but involves large forces on the equipment and large strains in the sheet material. The constant development of sheet metals toward higher strength and formability leads to increased forces on the shearing equipment and tools. Shearing of new sheet materials implies that new suitable shearing parameters must be found. Investigating shearing parameters through live tests in production is expensive, and separate experiments are time consuming and require specialized equipment. Studies involving a large number of parameters and coupled effects are therefore preferably performed by finite-element-based simulations. Accurate experimental data are still a prerequisite to validate such simulations, and there is a shortage of such data. In industrial shearing processes, measured forces are always larger than the actual forces acting on the sheet, due to friction losses. Shearing also generates a force that attempts to separate the two tools, changing the shearing conditions through increased clearance between the tools. Tool clearance is also the most common shearing parameter to adjust, depending on material grade and sheet thickness, to moderate the required force and to control the final sheared-edge geometry. In this work, an experimental procedure that provides a stable tool clearance together with accurate measurements of tool forces and tool displacements was designed, built, and evaluated. Important shearing parameters and demands on the experimental set-up were identified in a sensitivity analysis performed with finite element simulations under the assumption of plane strain.
With respect to tool clearance stability and accurate force measurements, a symmetric experiment with two simultaneous shears and internal balancing of the forces attempting to separate the tools was constructed. Steel sheets of different strength levels were sheared using the above-mentioned experimental set-up with various tool clearances, sheet clamping conditions, and rake angles. Results showed that tool penetration before fracture decreased with increased material strength. When one side of the sheet was left unclamped and free to move, the required shearing force decreased but the force attempting to separate the two tools increased. Further, the maximum shearing force decreased and the rollover increased with increased tool clearance. Digital image correlation was applied to measure strains on the sheet surface. The obtained strain fields, together with a material model, were used to compute the stress state in the sheet. A comparison, up to crack initiation, of these experimental results with corresponding results from finite element simulations in three dimensions and in a plane strain approximation showed that effective strains on the surface are representative also of the bulk material. A simple model was successfully applied to calculate the tool forces in shearing with angled tools from forces measured with parallel tools. These results suggest that, with respect to tool forces, a plane strain approximation is valid also for angled tools, at least for small rake angles. In general terms, this study provides a stable symmetric experimental set-up with internal balancing of lateral forces, for accurate measurements of tool forces, tool displacements, and sheet deformations, to study the effects of important shearing parameters. The results give further insight into the strain and stress conditions at crack initiation during shearing, and can also be used to validate models of the shearing process.
Abstract:
This thesis introduces a new innovation methodology called IDEAS(R)EVOLUTION, developed in an ongoing experimental research project started in 2007. This new approach to innovation was initially based on the theory and practice of Design thinking for innovation. The concept of design thinking for innovation has received much attention in recent years. This innovation approach has climbed from the knowledge field of design and designers towards other knowledge areas, mainly business management and marketing. A human-centered approach, radical collaboration, creativity, and breakthrough thinking are the main founding principles of Design thinking, adopted by those knowledge areas due to their assertiveness and fitness to the business context and the evolution of market complexity. Open innovation, user-centered innovation, and later Living Labs models also emerged as answers to market and consumer pressure and the desire for new products, new services, or new business models. Innovation became the principal focus and strategic orientation of business management. All these changes also had an impact on marketing theory. It is now possible to have better strategies, communication plans, and continuous dialogue systems with the target audience, incorporating their insights and promoting them to be the main ambassadors disseminating our innovations in the market. Drawing upon data from five case studies, the empirical findings in this dissertation suggest that companies need to shift from a Design thinking for innovation approach to a holistic, multidimensional, and integrated innovation system. The innovation context is complex; companies need deeper systems than the success formulas that "commercial" Design thinking for innovation preaches.
They need to learn how to change their organizational culture, how to empower their workforce and collaborators, how to incorporate external stakeholders in their innovation processes, how to measure and create key performance indicators throughout the innovation process to obtain better decision-making data, and how to integrate meaning and purpose into their innovation philosophy. Finally, they need to understand that the strategic innovation effort is not a "one shot" story; it is about creating a continuous flow of interaction and dialogue with their clients within a "value creation chain" mindset. ABSTRACT: Co-creation methodology for a product/brand crossing Marketing, Design Thinking, Creativity and Management - IDEAS(R)EVOLUTION. This dissertation presents a new innovation methodology called IDEAS(R)EVOLUTION, developed within an ongoing experimental research project begun in 2007. This new approach was initially based on the theory and practice of Design thinking for innovation. The concept of Design Thinking for innovation has now moved beyond the knowledge domain of Design and Designers, attracting great interest in other areas such as Management and Marketing. A person-centered approach, radical collaboration, creativity, and disruptive thinking are founding principles of the Design thinking movement that have been adopted by these new knowledge areas owing to their assertiveness and adaptability to the business context and to the evolution and complexity of the Market. Open Innovation models, user-centered innovation, and later Living Labs also emerged as possible answers to the Market and to consumers' pressure and desire for new products, services, or business models. Innovation became the main focus and strategic orientation in Management. All these changes also had an impact on Marketing theory.
Today it is possible to create better strategies, communication plans, and continuous dialogue systems with the target audience, incorporating their insights and promoting consumers as ambassadors in disseminating companies' innovation in the Market. The empirical results of this thesis, built on the information obtained in the five case studies carried out, suggest that companies need to reorient themselves from the paradigm of Design thinking for innovation towards a more holistic, multidimensional, and integrated innovation system. The innovation context is complex, so companies need deeper systems and not just the "commercial formulas" that Design thinking for innovation advocates. Companies need to learn how to change their organizational culture, how to empower their workforce and collaborators, how to incorporate external stakeholders into the innovation process, how to measure the innovation process by creating key performance indicators and obtaining data for more informed decision-making, and how to integrate meaning and purpose into their innovation philosophy. Finally, they need to understand that an innovation strategy is not about "succeeding once", but about creating a continuous flow of interaction and dialogue with their clients within a "value creation chain" mindset.
Abstract:
The interdisciplinary relationship between industrial design and mechanical engineering is sensitive. This research focuses on understanding how one can positively mediate this relationship in order to foster innovation. In this paper, technology is considered for this role, since it has, at some historical moments, served as an integrator of these two disciplines in processes that led to innovation. By means of an extensive literature review covering three different periods of technological development, the positioning of both disciplines in society and their link with technology are analyzed and compared. The three case studies selected help to illustrate precisely the positioning of technology between the two disciplines and society. The literature holds that industrial design is rooted in the rise of criticism of both the machine and mechanized production. This opposes the current paradigm, in which design plays a fundamental role in adapting technology to society. The social problems caused by mechanized mass production also triggered the emergence of mechanical engineering as a professionalized discipline. Technology was intrinsically connected with the emergence and subsequent evolution of both industrial design and mechanical engineering. In the conflict between technology and society lies the reform and regulation of design practice, in its broadest sense.