901 results for design or documentation process


Relevance:

40.00%

Publisher:

Abstract:

This study took place at one of the intercultural universities (IUs) of Mexico that serve primarily indigenous students. The IUs are pioneers in higher education despite their numerous challenges (Bertely, 1998; Dietz, 2008; Pineda & Landorf, 2010; Schmelkes, 2009). To overcome educational inequalities among their students (Ahuja, Berumen, Casillas, Crispín, Delgado et al., 2004; Schmelkes, 2009), the IUs have embraced performance-based assessment (PBA; Casillas & Santini, 2006). PBA allows a shared model of power and control related to learning and evaluation (Anderson, 1998). While conducting a review of the PBA strategies of the IUs, the researcher did not find a PBA instrument with valid and reliable estimates. The purpose of this study was to develop a process to create a PBA instrument, an analytic general rubric, with acceptable validity and reliability estimates to assess students' attainment of competencies in one of the IUs' majors, Intercultural Development Management. The Human Capabilities Approach (HCA) was the theoretical framework, and a sequential mixed method (Creswell, 2003; Teddlie & Tashakkori, 2009) was the research design. IU participants created a rubric during two focus groups, and seven Spanish-speaking professors in Mexico and the US piloted it using students' research projects. The evidence that demonstrates the attainment of competencies at the IU is a complex set of actual, potential, and/or desired performances or achievements, also conceptualized as "functional capabilities" (FCs; Walker, 2008), that can be used to develop a rubric. Results indicate that the rubric's validity and reliability reached the acceptable level of 80% agreement, surpassing minimum requirements (Newman, Newman, & Newman, 2011). Implications for practice involve the use of PBA within a formative assessment framework and the dynamic inclusion of constituencies.
Recommendations for further research include introducing this study's instrument-development process to other IUs, conducting parallel mixed design studies exploring the intersection between HCA and assessment, and conducting a case study exploring assessment in intercultural settings. Education articulated through the HCA empowers students (Unterhalter & Brighouse, 2007; Walker, 2008). This study aimed to contribute to the quality of student learning assessment at the IUs by providing a participatory process to develop a PBA instrument.
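As an illustration of the reliability criterion reported above, percent agreement can be computed as the fraction of items on which two raters assign the same rubric level. This is a minimal sketch of the general metric, not the study's instrument; the rater scores below are hypothetical.

```python
def percent_agreement(rater_a, rater_b):
    """Percentage of items on which two raters gave the same rubric level."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return 100.0 * matches / len(rater_a)

# Hypothetical scores for ten student projects on a 4-level rubric.
rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_b = [3, 2, 4, 2, 1, 2, 3, 4, 2, 1]
print(percent_agreement(rater_a, rater_b))  # → 80.0
```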


Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved within constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts mentioned in this study can be easily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on Field-Programmable Gate Array (FPGA) technology is used to resolve system bottlenecks and improve system performance. The identified hotspot function is then converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, central bus design and co-processor design, are implemented for comparison in the proposed architecture.
(3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed. The trade-offs among these three factors are compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow. Hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique. The system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design. The co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
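Why hotspot identification matters can be seen from Amdahl's law: the overall speedup obtained by accelerating only one function is bounded by that function's share of total runtime. The sketch below uses hypothetical numbers for illustration, not the thesis's measured results.

```python
def amdahl_speedup(hotspot_fraction, accel_factor):
    """Overall speedup when only the hotspot is accelerated (Amdahl's law)."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accel_factor)

# Hypothetical case: a profiled hotspot taking 80% of runtime,
# accelerated 10x by an FPGA co-processor.
print(round(amdahl_speedup(0.8, 10.0), 2))  # → 3.57
```

The example shows why profiling comes first in the workflow: accelerating a function that accounts for little runtime yields almost no system-level gain, regardless of how fast the accelerator is.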


The study of the private management of public housing is an important topic to be critically analyzed as governments search for ways to increase efficiency in providing housing for the poor. Public housing authorities must address the cost of repairing or replacing the deteriorating housing stock, the increase in the need for affordable housing, and the lack of supply. Growing pressure for efficient use of public funds has heightened the need for profound structural reform. An important strategy for carrying out such reform is privatization. Although privatization does not work in every case, the majority position in the traditional privatization literature is that reliance on private organizations normally, but not always, results in cost savings. The primary purpose of this dissertation is to determine whether a consensus exists among decision-makers on the efficiency of privatizing the management of public housing. A secondary purpose is to review the techniques (best practices) used by the private sector that result in cost-efficiencies in the management of public housing. The study employs a triangulated research design using cross-sectional survey methodology, including a survey instrument to solicit responses from the private managers. The study also draws on qualitative methods: interviews with key informants from private-sector management firms and public housing agencies, case studies, focus groups, archival records, and housing authority documents. Results indicated that important decision-makers perceive that private managers made a positive contribution to cost-efficiencies in the management of public housing. The performance of private contractors served as a yardstick for comparing the efficiency of services produced in-house.
The study concluded that private managers made the benefits of their management techniques well known, creating a sense of competition between public and private managers. Competition from private contractors spurred municipal worker and management productivity improvements, creating better management results for the public housing authorities. The study results are in concert with recent research and studies that also concluded private managers have some distinct advantages in controlling costs in the management of public housing.


The purpose of this research was to design and implement a series of Latin shows to be featured at the Satine Restaurant, located in The Diplomat Hotel in Hollywood, Florida. Three shows were created, "Electro Tango," "Bossa Nova Jazz," and "Piel Canela Night," to help generate interest not only in the Satine Restaurant but also in the surrounding area. The artistic concept included big bands, costumes, dancers, and a DJ. A production book was created that included the most important aspects of each show, such as budgets, costumes, and ground plans, to assure the success of each event. A careful analysis of the area's demographics was conducted, and a marketing plan was designed and implemented. The research and practical application of similar shows in the industry determined that the production of these particular shows, although costly, has a viable chance of succeeding in this venue.


Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem solving process by means of a knowledge based (KB) system. Although a number of prototype KB systems have been proposed, they have many shortcomings. Firstly, few have incorporated sufficient expertise in modeling relationships, particularly higher order relationships. Secondly, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Thirdly, the problem solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design, called CODA, that addresses the above shortcomings was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver, 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance system approach, which is less restrictive, involves providing context-specific, informative, and suggestive guidance throughout the design process. Both approaches aim to prevent erroneous design decisions. The main objectives of the study are to evaluate (1) whether the knowledge-based system is more effective than a system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control).
An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved one task without using a system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance, or Restrictive (experimental task). Analysis of the experimental task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared to the Control system. The subjects perceived the Restrictive system to be easier to use than the Guidance system.


The architect materializes ideas through architectural representations that act differently during the design process: as an instrument for expressing creative ideas, as communication between designer and client, or as project documentation for execution (DURAND, 2003). This paper discusses the connection between architectural representations and the design process in a professional context, focusing on representation as an aid to conception. The general aim is to understand the role of architectural representations in the design process by identifying how their types and resources are appropriated. The investigation was developed through theoretical and conceptual studies of these themes, together with empirical, qualitative research with architects from the state of Rio Grande do Norte, Brazil, conducted in two stages: first, the completion of an electronic form, and second, case studies based on design exercises. The results of the indirect research showed that most architects and urbanists believe that the way representation types and resources are used can interfere with design conception. The case studies then showed that, motivated by different design conditions, most designers used representations differently, which was reflected in different design conceptions.


The main focus of this thesis is the relative localization problem of a heterogeneous team comprising both ground robots and micro aerial vehicles. This team configuration combines the advantages of the increased accessibility and better perspective provided by aerial robots with the greater computational and sensory resources of the ground agents, to realize a cooperative multi-robot system suitable for hostile autonomous missions. In such a scenario, however, the strict constraints on flight time, sensor payload, and computational capability of micro aerial vehicles limit the practical applicability of popular map-based localization schemes for GPS-denied navigation. The resource-limited aerial platforms of this team therefore demand simpler localization means for autonomous navigation. Relative localization is the process of estimating the formation of a robot team using acquired inter-robot relative measurements. This allows the team members to know their relative formation even without a global localization reference, such as GPS or a map. A typical robot team would thus benefit from a relative localization service, since it would allow the team to implement formation control, collision avoidance, and supervisory control tasks independent of a global localization service. More importantly, a heterogeneous team of ground robots and computationally constrained aerial vehicles would benefit from a relative localization service because it provides the crucial localization information required for autonomous operation of the weaker agents. This enables less capable robots to assume supportive roles and contribute to the more powerful robots executing the mission. Hence this study proposes a relative localization-based approach for ground and micro aerial vehicle cooperation, and develops the inter-robot measurement, filtering, and distributed computing modules necessary to realize the system.
The research study results in three significant contributions. First, the work designs and validates a novel inter-robot relative measurement hardware solution with the accuracy, range, and scalability characteristics necessary for relative localization. Second, it analyzes and designs a novel nonlinear filtering method, which allows the implementation of relative localization modules and attitude reference filters on low-cost devices with optimal tuning parameters. Third, it designs and validates a novel distributed relative localization approach, which harnesses the distributed computing capability of the team to minimize communication requirements, achieve consistent estimation, and enable efficient data correspondence within the network. The work validates the complete relative localization-based system through multiple indoor experiments and numerical simulations. The relative localization-based navigation concept, with the sensing, filtering, and distributed computing methods introduced in this thesis, compensates for the system limitations of a ground and micro aerial vehicle team and also targets hostile environmental conditions. The work thus constitutes an essential step towards realizing autonomous navigation of heterogeneous teams in real-world applications.
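The thesis's nonlinear filter itself is not reproduced here, but the general idea of an attitude reference filter on a low-cost device can be illustrated with a classic complementary filter, which trusts the gyroscope at high frequency and an accelerometer-derived angle at low frequency. This is a generic textbook sketch under simplified assumptions, not the method developed in the thesis.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step: integrate the gyro, then pull gently toward
    the accelerometer angle to cancel long-term gyro drift."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Hypothetical stream: the vehicle is level at 0.1 rad; the gyro reads
# zero rate and the accelerometer angle is centered on the truth.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=0.1, dt=0.01)
print(abs(angle - 0.1) < 0.01)  # → True
```

The blend factor `alpha` plays the role of a tuning parameter: closer to 1 trusts the gyro more, closer to 0 trusts the accelerometer more.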


Successful, efficient, and safe turbine design requires a thorough understanding of the underlying physical phenomena. This research investigates the physical understanding of, and the parameters highly correlated with, flutter, an aeroelastic instability prevalent among low pressure turbine (LPT) blades in both aircraft engines and power turbines. The modern way of determining whether a given cascade of LPT blades is susceptible to flutter is through time-expensive computational fluid dynamics (CFD) codes. These codes converge to a solution satisfying the Eulerian conservation equations subject to the boundary conditions of a nodal domain consisting of fluid and solid wall particles. Most detailed CFD codes are accompanied by cryptic turbulence models, meticulous grid constructions, and elegant boundary condition enforcements, all with one goal in mind: determine the sign (and therefore stability) of the aerodynamic damping. The main question asked by the aeroelastician is, "is it positive or negative?" This type of thought process eventually gives rise to a black-box effect, leaving physical understanding behind. Therefore, the first part of this research aims to understand and reveal the physics behind LPT flutter, in addition to several related topics including acoustic resonance effects. Part of this initial numerical investigation is completed using an influence coefficient approach to study the variation in the work-per-cycle contributions of neighboring cascade blades to a reference airfoil. The second part of this research introduces new discoveries regarding the relationship between steady aerodynamic loading and negative aerodynamic damping.
Using validated CFD codes as computational wind tunnels, a multitude of low-pressure turbine flutter parameters, such as reduced frequency, mode shape, and interblade phase angle, will be scrutinized across various airfoil geometries and steady operating conditions to reach new design guidelines regarding the influence of steady aerodynamic loading on LPT flutter. Many pressing topics influencing LPT flutter, including shocks, their nonlinearity, and three-dimensionality, are also addressed along the way. The work is concluded by introducing a useful preliminary design tool that can estimate within seconds the entire aerodynamic damping versus nodal diameter curve for a given three-dimensional cascade.
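The sign convention behind aerodynamic damping can be illustrated with the standard single-degree-of-freedom work-per-cycle argument: for a blade displacement x(t) = A sin(wt) and unsteady force F(t) = F0 sin(wt + phi), the work done on the blade over one cycle is W = pi*A*F0*sin(phi), so a force leading the motion (phi > 0) feeds energy into the blade, i.e. negative aerodynamic damping. This is a textbook sketch, not the thesis's CFD computation.

```python
import math

def work_per_cycle(force_amp, disp_amp, phase, omega=1.0, n=10000):
    """Numerically integrate W = closed-path integral of F dx over one
    vibration cycle for x(t) = A sin(wt) and F(t) = F0 sin(wt + phase)."""
    period = 2.0 * math.pi / omega
    dt = period / n
    work = 0.0
    for i in range(n):
        t = i * dt
        force = force_amp * math.sin(omega * t + phase)
        xdot = disp_amp * omega * math.cos(omega * t)  # blade velocity
        work += force * xdot * dt
    return work  # analytic value: pi * disp_amp * force_amp * sin(phase)

# Force leading displacement: positive work on the blade,
# i.e. negative aerodynamic damping, a flutter-prone condition.
print(work_per_cycle(1.0, 1.0, math.pi / 6) > 0)  # → True
```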


A RET network consists of a network of photo-active molecules called chromophores that can participate in inter-molecular energy transfer called resonance energy transfer (RET). RET networks are used in a variety of applications including cryptographic devices, storage systems, light harvesting complexes, biological sensors, and molecular rulers. In this dissertation, we focus on creating a RET device called closed-diffusive exciton valve (C-DEV) in which the input to output transfer function is controlled by an external energy source, similar to a semiconductor transistor like the MOSFET. Due to their biocompatibility, molecular devices like the C-DEVs can be used to introduce computing power in biological, organic, and aqueous environments such as living cells. Furthermore, the underlying physics in RET devices are stochastic in nature, making them suitable for stochastic computing in which true random distribution generation is critical.

In order to determine a valid configuration of chromophores for the C-DEV, we developed a systematic process based on user-guided design space pruning techniques and built-in simulation tools. We show that our C-DEV is 15x better than C-DEVs designed using ad hoc methods that rely on limited data from prior experiments. We also show ways in which the C-DEV can be improved further and how different varieties of C-DEVs can be combined to form more complex logic circuits. Moreover, the systematic design process can be used to search for valid chromophore network configurations for a variety of RET applications.

We also describe a feasibility study for a technique used to control the orientation of chromophores attached to DNA. Being able to control the orientation can expand the design space for RET networks because it provides another parameter to tune their collective behavior. While results showed limited control over orientation, the analysis required the development of a mathematical model that can be used to determine the distribution of dipoles in a given sample of chromophore constructs. The model can be used to evaluate the feasibility of other potential orientation control techniques.


While molecular and cellular processes are often modeled as stochastic processes, such as Brownian motion, chemical reaction networks, and gene regulatory networks, there are few attempts to program a molecular-scale process to physically implement stochastic processes. DNA has been used as a substrate for programming molecular interactions, but its applications are restricted to deterministic functions, and unfavorable properties such as slow processing, thermal annealing, aqueous solvents, and difficult readout limit them to proof-of-concept purposes. To date, whether there exists a molecular process that can be programmed to implement stochastic processes for practical applications remains unknown.

In this dissertation, a fully specified Resonance Energy Transfer (RET) network between chromophores is accurately fabricated via DNA self-assembly, and the exciton dynamics in the RET network physically implement a stochastic process, specifically a continuous-time Markov chain (CTMC), which has a direct mapping to the physical geometry of the chromophore network. Excited by a light source, a RET network generates random samples in the temporal domain in the form of fluorescence photons which can be detected by a photon detector. The intrinsic sampling distribution of a RET network is derived as a phase-type distribution configured by its CTMC model. The conclusion is that the exciton dynamics in a RET network implement a general and important class of stochastic processes that can be directly and accurately programmed and used for practical applications of photonics and optoelectronics. Different approaches to using RET networks exist with vast potential applications. As an entropy source that can directly generate samples from virtually arbitrary distributions, RET networks can benefit applications that rely on generating random samples such as 1) fluorescent taggants and 2) stochastic computing.
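The sampling behavior described above can be illustrated with a toy continuous-time Markov chain: exponentially distributed holding times in each transient state, a jump chain proportional to the transition rates, and an absorbing "photon emitted" state whose hitting time follows a phase-type distribution. The three-state chain and its rates below are hypothetical, not a real chromophore network.

```python
import random

# Hypothetical 3-state CTMC: states 0 and 1 are transient (exciton on a
# chromophore), state 2 is absorbing ("fluorescence photon detected").
# RATES[i][j] is the transition rate from state i to state j.
RATES = [[0.0, 2.0, 1.0],
         [1.5, 0.0, 3.0],
         [0.0, 0.0, 0.0]]

def sample_emission_time(rng=random):
    """Draw one sample from the phase-type hitting-time distribution."""
    t, state = 0.0, 0
    while state != 2:
        total = sum(RATES[state])       # total exit rate of the current state
        t += rng.expovariate(total)     # exponential holding time
        u = rng.uniform(0.0, total)     # choose next state proportional to rate
        acc = 0.0
        for nxt, rate in enumerate(RATES[state]):
            acc += rate
            if u <= acc:
                state = nxt
                break
    return t

random.seed(7)
samples = [sample_emission_time() for _ in range(1000)]
print(all(s > 0 for s in samples))  # → True
```

Each call mimics one detected photon: the returned time is a random sample whose distribution is configured entirely by the rate matrix, which is the sense in which the network's geometry "programs" the distribution.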

By using RET networks between chromophores to implement fluorescent taggants with temporally coded signatures, the taggant design is not constrained by resolvable dyes and has a significantly larger coding capacity than spectrally or lifetime coded fluorescent taggants. Meanwhile, the taggant detection process becomes highly efficient, and the Maximum Likelihood Estimation (MLE) based taggant identification guarantees high accuracy even with only a few hundred detected photons.

Meanwhile, RET-based sampling units (RSU) can be constructed to accelerate probabilistic algorithms for wide applications in machine learning and data analytics. Because probabilistic algorithms often rely on iteratively sampling from parameterized distributions, they can be inefficient in practice on the deterministic hardware traditional computers use, especially for high-dimensional and complex problems. As an efficient universal sampling unit, the proposed RSU can be integrated into a processor / GPU as specialized functional units or organized as a discrete accelerator to bring substantial speedups and power savings.


The purpose of this research is to examine the use of a mock-up review process in interior design projects to better understand the implications of using such a process within the standard professional practice model. The research consisted of interviewing design professionals who utilize mock-ups as part of their standard of practice. These interviews centered on two groups: those working in shipbuilding, where mock-ups have a long history, and those working on land-based projects, where mock-up use is rare. Analysis of the interviews indicated a positive relationship between mock-up use and collaboration, innovation, and problem solving. The interviews also brought to light concerns among all the professionals surveyed about the current practice model in land-based building design and construction projects within the United States. The positive relationships shown in this thesis support further research to explore how mock-ups can best be utilized in interior design.


A recently developed novel biomass fuel pellet, the Q' Pellet, offers significant improvements over conventional white pellets, with characteristics comparable to those of coal. The Q' Pellet was initially created at bench scale using a proprietary die and punch design, in which the biomass was torrefied in-situ and then compressed. To bring the benefits of the Q' Pellet to a commercial level, it must be capable of being produced in a continuous process at a competitive cost. A prototype machine was previously constructed in a first effort to assess continuous processing of the Q' Pellet. The prototype torrefied biomass in a separate, ex-situ reactor and transported it into a rotary compression stage. Upon evaluation, parts of the prototype were found to be unsuccessful and required a redesign of the material transport method as well as the compression mechanism. A process was developed in which material was torrefied ex-situ and extruded in a pre-compression stage. The extruded biomass overcame multiple handling issues that had been experienced with un-densified biomass, facilitating efficient material transport. Biomass was extruded directly into a novel re-designed pelletizing die, which incorporated a removable cap, an ejection pin, and a die spring to accommodate a repeatable continuous process. Although after several uses the die required manual intervention due to minor design and manufacturing quality limitations, the system clearly demonstrated the capability of producing the Q' Pellet in a continuous process. Q' Pellets produced by the pre-compression method and pelletized in the re-designed die had an average dry-basis gross calorific value of 22.04 MJ/kg and a pellet durability index of 99.86%, and gained 6.2% of their initial mass after 24 hours submerged in water. This compares well with literature results of 21.29 MJ/kg, 100% pellet durability index, and <5% mass increase in a water submersion test.
These results indicate that the methods developed herein are capable of producing Q’ Pellets in a continuous process with fuel properties competitive with coal.


This thesis presents details of the design and development of novel tools and instruments for scanning tunneling microscopy (STM), and may be considered a repository for several years' worth of development work. The author presents design goals and implementations for two microscopes. First, a novel Pan-type STM was built that could be operated in an ambient environment as a liquid-phase STM. Unique features of this microscope include a unibody frame for increased microscope rigidity, a novel slider component with large Z-range, a unique wiring scheme and damping mechanism, and a removable liquid cell. The microscope exhibits a high level of mechanical isolation at the tunnel junction and operates excellently as an ambient tool. Experiments in liquid are on-going. Simultaneously, the author worked on designs for a novel low temperature, ultra-high vacuum (LT-UHV) instrument, and these are presented as well. A novel stick-slip vertical coarse approach motor was designed and built. To gauge the performance of the motor, an in situ motion sensing apparatus was implemented, which could measure the step size of the motor to high precision. A new driving circuit for stick-slip inertial motors is also presented, which offers improved performance over our previous driving circuit at a fraction of the cost. The circuit was shown to increase step size performance by 25%. Finally, a horizontal sample stage was implemented in this microscope. The build of this UHV instrument is currently being finalized. In conjunction with the above design projects, the author was involved in a collaborative project characterizing N-heterocyclic carbene (NHC) self-assembled monolayers (SAMs) on Au(111) films. STM was used to characterize Au substrate quality, for both commercial substrates and those manufactured via a unique atomic layer deposition (ALD) process by collaborators.
Ambient and UHV STM was then also used to characterize the NHC/Au(111) films themselves, and several key properties of these films are discussed. During this study, the author discovered an unexpected surface contaminant, and details of this are also presented. Finally, two models are presented for the nature of the NHC-Au(111) surface interaction based on the observed film properties, and some preliminary theoretical work by collaborators is presented.


When plastic pipe is solidified, it proceeds through a long cooling chamber. Inside this chamber, within the hollow extrudate, the plastic is molten, and this inner surface solidifies last. Sag, the flow due to the self-weight of the molten plastic, then happens in this cooling chamber, and sometimes thickened regions (called knuckles) arise in the lower quadrants, especially of large-diameter thick-walled pipes. To compensate for sag, engineers normally shift the die centerpiece downward. This thesis focuses on the consequences of this decentering. Specifically, when the molten polymer is viscoelastic, as is normally the case, a downward lateral force is exerted on the mandrel. Die eccentricity also affects the downstream axial force on the mandrel. These forces govern how rigidly the mandrel must be attached (normally, on a spider die). We attack this flow problem in eccentric cylindrical coordinates, using the Oldroyd 8-constant constitutive model framework. Specifically, we revise the method of Jones (1964), called polymer process partitioning. We estimate both axial and lateral forces. We develop a corresponding map to help plastics engineers predict the extrudate shape, including extrudate knuckles. From the mass balance over the post-die region, we then predict the shape of the extrudate entering the cooling chamber. We further include expressions for the stresses in the extruded polymer melt. We include detailed dimensional worked examples to show process engineers how to use our results to design pipe dies, and especially to suppress extrudate knuckling.