996 results for Uniform Commercial Code
Abstract:
Summary - Cooking banana is one of the most important crops in Uganda; it is a staple food and a source of household income in rural areas. The most common cooking banana is locally called matooke, a triploid Musa sp. of the acuminata genome group (AAA-EAHB). It is perishable and traded in fresh form, leading to very high postharvest losses (22-45%). This is attributed to non-uniform harvest maturity, poor handling, bulk transportation and a lack of value-addition/processing technologies, which are currently the main challenges for the trade, export and diversified utilization of matooke. Drying is one of the oldest technologies employed in processing agricultural produce. A lot of research has been carried out on drying of fruits and vegetables, but little information is available on matooke. Drying matooke and milling it into flour extends its shelf-life and is an important means of overcoming the above challenges. Raw matooke flour is a generic flour developed to improve the shelf stability of the fruit and to find alternative uses for it. It is rich in starch (80-85% db) and consequently has high potential as a calorie resource base. It possesses good properties for both food and non-food industrial use. Some effort has been made to commercialize the processing of matooke, but there is still limited information on its processing into flour. It was therefore imperative to carry out an in-depth study to bridge the following gaps: the lack of accurate information on the maturity window within which matooke can be harvested for processing into flour, which leads to non-uniform quality of matooke flour; the absence of moisture sorption isotherms for matooke, from which the minimum equilibrium moisture content in relation to temperature and relative humidity is obtainable, below which dry matooke would be microbiologically shelf-stable; and the lack of information on the drying behavior of matooke and on standardized processing parameters in relation to the physicochemical properties of the flour. The main objective of the study was to establish the optimum harvest maturity window and optimize the processing parameters for obtaining standardized, microbiologically shelf-stable matooke flour with good starch quality attributes. This research was designed to: i) establish the optimum harvest maturity window within which matooke can be harvested to produce a consistent quality of matooke flour, ii) establish the sorption isotherms for matooke, iii) establish the effect of process parameters on the drying characteristics of matooke, iv) optimize the drying process parameters for matooke, v) validate the models of maturity and optimum process parameters, and vi) standardize process parameters for commercial processing of matooke. Samples were obtained from a banana plantation at the Presidential Initiative on Banana Industrial Development (PIBID) Technology Business Incubation Center (TBI) at Nyaruzunga, Bushenyi, in Western Uganda. A completely randomized design (CRD) was employed in selecting the banana stools from which samples for the experiments were picked. The cultivar Mbwazirume, which is soft-cooking and commonly grown in Bushenyi, was selected for the study. The static gravimetric method recommended by the COST 90 Project (Wolf et al., 1985) was used for the determination of moisture sorption isotherms. A research dryer was developed for this work. All experiments were carried out in laboratories at TBI. The physiological maturity of matooke cv. Mbwazirume at Bushenyi is 21 weeks.
The optimum harvest maturity window for commercial processing of matooke flour (Raw Tooke Flour - RTF) at Bushenyi is between 15 and 21 weeks. The finger-weight model is recommended for farmers to estimate harvest maturity, and the combined model of finger weight and pulp-peel ratio is recommended for commercial processors. Matooke isotherms exhibited type II curve behavior, which is characteristic of foodstuffs, and the GAB model best described all the adsorption and desorption moisture isotherms. For commercial processing of matooke, in order to obtain a microbiologically shelf-stable dry product, it is recommended to dry it to a moisture content at or below 10% (wb). The moisture sorption isotherms for matooke exhibited hysteresis. The isosteric heat of sorption for both adsorption and desorption isotherms increased with decreasing moisture content. The total isosteric heat of sorption for matooke ranged from 4,586 to 2,386 kJ/kg for the adsorption isotherm and from 18,194 to 2,391 kJ/kg for the desorption isotherm, over equilibrium moisture contents from 0.3 to 0.01 (db) respectively. The minimum energy required for drying matooke from 80% to 10% (wb) is 8,124 kJ/kg of water removed, implying that the minimum energy required for drying 1 kg of fresh matooke from 80% to 10% (wb) is 5,793 kJ. The drying of matooke takes place in three phases: a warm-up period and two falling-rate periods. The drying rate constant varied with the processing parameters, and the effective diffusivity ranged from 1.5E-10 to 8.27E-10 m²/s. The activation energy (Ea) for matooke was 16.3 kJ/mol (1,605 kJ/kg). Comparing the activation energy (Ea) with the net isosteric heat of sorption for the desorption isotherm (qst = 1,297.62) at 0.1 kg water/kg dry matter indicated that Ea was higher than qst, suggesting that moisture molecules travel in liquid form within the matooke slices. The total color difference (ΔE*) between fresh and dry samples was lowest for a slice thickness of 7 mm, followed by an air velocity of 6 m/s and a drying air temperature of 70°C. A drying system controlled by a set product surface temperature reduced the drying time by 50% compared with one controlled by a set drying air temperature. The processing parameters did not have a significant effect on the physicochemical and quality attributes, suggesting that any drying air temperature can be used in the initial stages of drying as long as the product temperature does not exceed the gelatinization temperature of matooke (72°C). The optimum processing parameters for single-layer drying of matooke are: thickness 3 mm, air temperature 70°C, dew point temperature 18°C and air velocity 6 m/s in overflow mode. From a practical point of view, it is recommended that commercial processing of matooke employ multi-layer drying with a loading capacity of 7 kg/m² or less, thickness 3 mm, air temperature 70°C, dew point temperature 18°C and air velocity 6 m/s in overflow mode.
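For reference, the GAB (Guggenheim-Anderson-de Boer) model named above has the standard three-parameter form shown below; the monolayer moisture content and the two energy constants are fitted per product and are not reported in this summary, so this is only a sketch of the model family the study fitted:

```latex
% Standard GAB sorption-isotherm model (generic form; the fitted
% constants for matooke are not given in the abstract).
%   M   : equilibrium moisture content (db)
%   M_0 : monolayer moisture content
%   C,K : GAB energy constants
%   a_w : water activity
M = \frac{M_0 \, C \, K \, a_w}{\bigl(1 - K a_w\bigr)\bigl(1 - K a_w + C K a_w\bigr)}
```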
Abstract:
The furious pace of Moore's Law is driving computer architecture into a realm where the speed of light is the dominant factor in system latencies. The number of clock cycles required to span a chip is increasing, while the number of bits that can be accessed within a clock cycle is decreasing. Hence, it is becoming more difficult to hide latency. One alternative is to reduce latency by migrating threads and data, but the overhead of existing implementations has so far made migration an unserviceable solution. I present an architecture, implementation, and mechanisms that reduce the overhead of migration to the point where migration is a viable supplement to other latency-hiding mechanisms, such as multithreading. The architecture is abstract, and presents programmers with a simple, uniform, fine-grained multithreaded parallel programming model with implicit memory management. In other words, the spatial nature and implementation details (such as the number of processors) of a parallel machine are entirely hidden from the programmer. Compiler writers are encouraged to devise programming languages for the machine that guide a programmer to express their ideas in terms of objects, since objects exhibit an inherent physical locality of data and code. The machine implementation can then leverage this locality to automatically distribute data and threads across the physical machine using a set of high-performance migration mechanisms. An implementation of this architecture could migrate a null thread in 66 cycles -- over a factor of 1000 improvement over previous work. Performance also scales well; the time required to move a typical thread is only 4 to 5 times that of a null thread. Data migration performance is similar, and scales linearly with data block size. Since the performance of the migration mechanism is on par with that of an L2 cache, the implementation simulated in my work has no data caches and relies instead on multithreading and the migration mechanism to hide and reduce access latencies.
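A minimal sketch of the cost relationships this abstract reports (66 cycles for a null thread, roughly 4-5x that for a typical thread, and data migration linear in block size); the per-word slope and fixed data-migration cost are illustrative assumptions, not figures from the thesis:

```python
# Illustrative cost model for the migration figures quoted above.
# NULL_THREAD_CYCLES comes from the abstract; THREAD_FACTOR uses the
# reported 4-5x range; BASE_DATA_CYCLES and CYCLES_PER_WORD are made-up
# stand-ins for the "linear in data block size" claim.
NULL_THREAD_CYCLES = 66   # measured cost to migrate a null thread
THREAD_FACTOR = 4.5       # typical thread is ~4-5x a null thread
BASE_DATA_CYCLES = 66     # assumed fixed cost per data migration
CYCLES_PER_WORD = 2       # assumed incremental cost per word moved

def thread_migration_cycles(typical: bool = True) -> float:
    """Estimated cycles to migrate a thread."""
    return NULL_THREAD_CYCLES * (THREAD_FACTOR if typical else 1.0)

def data_migration_cycles(block_words: int) -> float:
    """Estimated cycles to migrate a data block; linear in its size."""
    return BASE_DATA_CYCLES + CYCLES_PER_WORD * block_words

if __name__ == "__main__":
    print(thread_migration_cycles(typical=False))  # 66.0
    print(thread_migration_cycles())               # ~297 cycles
    print(data_migration_cycles(block_words=64))   # 66 + 128 = 194
```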
Abstract:
UK commercial property lease structures have come under considerable scrutiny during the past decade, since the property crash of the early 1990s. In particular, tenants complained that the system was unfair and that it blocked business change. The Government committed itself, through its 2001 election manifesto, to promoting flexibility and choice in the commercial property lettings market, and a new voluntary Commercial Leases Code of Practice was launched in April 2002. This paper investigates whether occupiers are being offered the leases they require or whether there is a mismatch between occupier requirements and the leases actually available in the market. It draws together the substantial data now available on the actual terms of leases in the UK and surveys of corporate occupiers' attitudes to their occupation requirements. Although the data indicate that UK leases have become shorter and more diverse since 1990, this is still not sufficient to meet the current requirements of many corporate occupiers. It is clear that the inability to manage entry and exit strategies is a major concern to occupiers. Lease length is the primary concern of tenants, and a number of respondents comment on the mismatch between lease lengths in the UK and business planning horizons. The right to break and other problems with alienation clauses also pose serious difficulties for occupiers, thus reinforcing the mismatch. Other issues include repairing and insuring clauses and the type of review clause. There are differences of opinion between types of occupier. In particular, international corporate occupiers are significantly more concerned about the length of lease and the incidence of break clauses than national occupiers, and private-sector tenants are significantly more concerned about leasing in general than public-sector occupiers. The solutions proposed by tenants are predictable and include shorter leases, more frequent breaks and relaxation of restrictions concerning alienation and other clauses. A significant number specify that they would pay more for shorter leases and other improved terms. Short leases would make many of the other terms more acceptable, and this is why they are the main concern of corporate occupiers. Overall, the evidence suggests that there continues to be a gap between occupiers' lease requirements and those currently offered by the market. There are underlying structural factors that act as an inertial force on landlords and inhibit the changes which occupiers appear to want. Nevertheless, the findings raise future research questions concerning whether UK lease structures are a constraining factor on UK competitiveness.
Abstract:
The terms of a commercial property lease cover aspects such as rent, alterations to premises and the ability to leave; consequently they have a significant impact on cash flow and on the ability of a business to develop. In contrast to the heavily legislated residential sector, commercial landlords and tenants in the UK are largely free to negotiate the terms of their contract. Yet, since the property crash of 1989/90, successive governments have taken an interest in commercial leasing; in particular, there is a desire to see landlords being more flexible. UK Government policy in this area has been pursued through industry self-regulation rather than legislation; since 1995 there have been three industry codes of practice on leasing, sanctioned and monitored by government. Yet, 15 years after the first code was launched, many in the industry see the whole code concept as ineffective and unlikely ever to achieve changes to certain aspects of landlord behaviour. This paper is the first step in considering the lease codes in the wider context of industry self-regulation. The aim of the paper is twofold: first, a framework is created from the literature on industry self-regulation across various countries and industries, suggesting key criteria that explain the effectiveness (or ineffectiveness) of self-regulation; this framework is then applied to the UK lease codes, based on research carried out by the authors for the UK Government to monitor the success of all three codes. The outcome is a clearer understanding of the possibilities and limitations of using a voluntary solution to achieve policy aims within the property industry.
Abstract:
The UK government has sought to change commercial property leasing practices since the recession of the 1990s, using industry self-regulation through an industry code of practice as the vehicle for these changes. However, the code has had little direct success in changing practices, despite repeated threats of legislation as a constant backdrop to the initiative. The focus of this research is the role of the industry bodies in the code initiative, as they have been central to self-regulation in commercial leasing. The aim is thus to investigate the role of industry bodies in the process of institutional change, in the context of industry self-regulation and the specific setting of commercial leasing. The main industry bodies in focus are the British Property Federation and the Royal Institution of Chartered Surveyors. An existing model of institutional change forms the framework for the research: a chronological narrative is constructed from secondary data and analysed to identify the actions of the industry bodies within the conceptual stages of the model. The analysis shows that the industry bodies had not acted as convincing agents of change for commercial leasing. In particular, there was a lack of theorisation, a key stage in the process: the industry bodies did not develop the framework necessary to guide their members through the change process. These shortcomings are likely to have contributed to the failure of the Code. The main conclusion, however, is that if industry self-regulation is led by government, then the state must work with industry bodies to harness their potential as champions and drivers of institutional change. This is particularly important in achieving change in institutionalised environments.
Abstract:
Toothpastes usually contain detergents, humectants, water, colorants, fluoride and thickeners (e.g. silica). Tooth wear has a multi-factorial etiology, and the use of abrasive dentifrices is related to abrasion of dental tissues during toothbrushing. This study evaluated in vitro the abrasiveness of a commercial silica-gel low-abrasive dentifrice compared with an experimental dentifrice containing vegetable (almond) oil. Distilled water served as a control group. Acrylic specimens (8 per group) were submitted to simulated toothbrushing with slurries of the commercial dentifrice, the experimental dentifrice, almond oil and water in an automatic brushing machine programmed to 30,000 brush strokes per specimen, which is equivalent to 2 years of manual toothbrushing. Thereafter, the surface roughness (Ra) of the specimens was analyzed with a Surfcorder SE 1700 profilometer. Data were analyzed statistically by ANOVA and Tukey's test at a 5% significance level. There were no statistically significant differences (p>0.05) in surface roughness after brushing with water, almond oil or the experimental dentifrice. The commercial dentifrice produced rougher surfaces compared with the control and the abrasive-free products (p<0.05). Further studies are necessary to confirm the potential benefits of using vegetable oil in toothpaste as an alternative to abrasives, in an attempt to minimize the tooth wear caused by toothbrushing.
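A minimal sketch of the reported analysis (one-way ANOVA followed by Tukey's test on Ra values, 8 specimens per group); the roughness readings below are invented placeholders, not the study's data:

```python
# One-way ANOVA plus Tukey's HSD on surface-roughness (Ra) readings,
# mirroring the analysis described above. Values are hypothetical.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

ra = {
    "water":        [0.11, 0.12, 0.10, 0.13, 0.11, 0.12, 0.10, 0.12],
    "almond_oil":   [0.12, 0.11, 0.13, 0.12, 0.10, 0.11, 0.12, 0.11],
    "experimental": [0.13, 0.12, 0.12, 0.14, 0.11, 0.13, 0.12, 0.13],
    "commercial":   [0.25, 0.27, 0.24, 0.26, 0.28, 0.25, 0.27, 0.26],
}

# Global test: does any group mean differ? (alpha = 0.05)
f_stat, p_value = f_oneway(*ra.values())
print(f"ANOVA: F={f_stat:.2f}, p={p_value:.4f}")

# Post-hoc pairwise comparisons with Tukey's HSD.
values = np.concatenate(list(ra.values()))
groups = np.repeat(list(ra.keys()), [len(v) for v in ra.values()])
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```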
Abstract:
The accuracy of simulating the aerodynamic and structural properties of the blades is crucial in wind-turbine technology. Hence the models used to implement these features need to be very precise and highly detailed. With the variety of blade designs being developed, the models should be versatile enough to adapt to the changes required by every design. We implement a combination of numerical models covering the structural and aerodynamic parts of the simulation, using the computational power of a parallel HPC cluster. The structural part models the heterogeneous internal structure of the beam based on a novel implementation of the Generalized Timoshenko Beam Model technique. Using this technique, the 3-D structure of the blade is reduced to an asymptotically equivalent 1-D beam, which lowers the computational cost of the model without compromising its accuracy. This structural model interacts with the flow model, a modified version of Blade Element Momentum (BEM) theory. The modified BEM accounts for large deflections of the blade and also considers the pre-defined structure of the blade: the coning and sweeping of the blade, the tilt of the nacelle and the twist of the sections along the blade length are all computed by the model, none of which are considered in classical BEM theory. Each of these two models provides feedback to the other, and the interactive computations lead to more accurate outputs. We successfully implemented the computational models to analyze and simulate the structural and aerodynamic aspects of the blades; the interactive nature of these models and their ability to recompute data using feedback from each other makes this code more efficient than the available commercial codes. In this thesis we start with the verification of these models by testing them on the well-known benchmark blade for the NREL-5MW Reference Wind Turbine, an alternative fixed-speed stall-controlled blade design proposed by Delft University, and a novel alternative design that we propose for a variable-speed stall-controlled turbine, which offers the potential for more uniform power control and improved annual energy production. To optimize the power output of the stall-controlled blade we modify the existing designs and study their behavior using the aforementioned aeroelastic model.
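A minimal sketch of the classical BEM fixed-point iteration that the modified flow model extends (the thesis's large-deflection, coning, sweep, tilt and twist corrections are not reproduced here); the constant airfoil polars and section geometry below are illustrative assumptions:

```python
# Classical blade-element-momentum iteration for the axial (a) and
# tangential (a') induction factors at one blade section. The constant
# lift/drag coefficients and geometry are placeholders, not values
# from the thesis.
import math

B = 3              # number of blades
r = 40.0           # radius of this section [m]
chord = 3.0        # section chord [m]
V = 10.0           # wind speed [m/s]
omega = 1.0        # rotor speed [rad/s]
CL, CD = 1.0, 0.01 # assumed constant airfoil polars

sigma = B * chord / (2.0 * math.pi * r)  # local solidity

a, a_prime = 0.0, 0.0
for _ in range(100):
    # Inflow angle from the local velocity triangle.
    phi = math.atan2((1.0 - a) * V, (1.0 + a_prime) * omega * r)
    cn = CL * math.cos(phi) + CD * math.sin(phi)  # normal coefficient
    ct = CL * math.sin(phi) - CD * math.cos(phi)  # tangential coefficient
    # Momentum-balance updates for the induction factors.
    a_new = 1.0 / (4.0 * math.sin(phi) ** 2 / (sigma * cn) + 1.0)
    ap_new = 1.0 / (4.0 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1.0)
    if abs(a_new - a) < 1e-6 and abs(ap_new - a_prime) < 1e-6:
        break
    a, a_prime = a_new, ap_new

print(f"a = {a:.4f}, a' = {a_prime:.4f}, inflow angle = {math.degrees(phi):.2f} deg")
```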
Abstract:
There is growing interest in the location of Treatment, Storage, and Disposal Facility (TSDF) sites in relation to minority communities. A number of studies have been completed, and their results have varied: some have shown a strong positive correlation between the location of TSDF sites and minority populations, while a few have found no significant relationship. The major difference between these studies has been the areal unit used. This study compared the minority populations of Texas census tracts and ZIP codes containing a TSDF, using the associated county as the comparison population. The hypothesis was that there is no difference between using census tracts and using ZIP codes to analyze the relationship between minority populations and TSDFs. The census data used were from 1990, and the initial list of TSDF sites was supplied by the Texas Natural Resource Conservation Commission. The TSDF site locations were checked using geographic information systems (GIS) programs in order to increase the accuracy of the identification of exposed ZIP codes and census tracts. The minority populations of the exposed areal units were compared using proportional differences, cross-tables, maps, and logistic regression. The dependent variable was the exposure status of the areal units under study, including counties, census tracts, and ZIP codes. The independent variables included minority group proportion and groupings of those proportions, educational status, household income, and home value. In all cases, education was significant or near-significant at the .05 level. Education, rather than minority proportion, was therefore the most significant predictor of the exposure status of a census tract or ZIP code.
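A minimal sketch of the logistic-regression step described above, assuming a hypothetical table of areal units with invented column names (minority_prop, education, income, home_value, exposed); statsmodels' Logit is used here as a stand-in for whatever package the original analysis employed:

```python
# Logistic regression of areal-unit exposure status on demographic
# predictors, mirroring the analysis described above. Column names
# and data are hypothetical placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
units = pd.DataFrame({
    "minority_prop": rng.uniform(0, 1, n),    # minority proportion
    "education": rng.uniform(0.3, 1.0, n),    # e.g. share of HS graduates
    "income": rng.normal(30_000, 8_000, n),   # household income
    "home_value": rng.normal(70_000, 20_000, n),
})
# Placeholder outcome: 1 if the unit contains a TSDF, else 0.
units["exposed"] = rng.integers(0, 2, n)

X = sm.add_constant(units[["minority_prop", "education",
                           "income", "home_value"]])
model = sm.Logit(units["exposed"], X).fit(disp=0)
print(model.summary())  # coefficients and p-values for each predictor
```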
Abstract:
Virtual worlds have moved from being a geek topic to one of mainstream academic interest. This transition is contingent not only on the augmented economic, societal and cultural value of these virtual realities and their effect upon real life, but also on their convenience as fields for experimentation and for testing models and paradigms. User creation, however, is not something that has been transplanted from the real to the virtual world, but a phenomenon and a dynamic process that happens from within and is defined through complex relationships between the commercial and the non-commercial, the commodified and the non-commodified, the individual and the communal, the amateur and the professional, art and non-art. Accounting for this complex environment, the present paper explores user-created content in virtual worlds, its dimensions and value and, above all, its constraints by code and law. It puts forward suggestions for better understanding and harnessing this creativity.
Abstract:
Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information, but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor affecting the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and a lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and captured, through contracts, spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure upon conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses this bold effort by Brown and Marsden.