875 results for HIGH-QUALITY-FACTOR


Relevance: 90.00%

Abstract:

We present single-stage direct fs-ablation results which show that it is possible to fabricate high-quality, high-aspect-ratio devices in a single-stage process using a CAD-optimised approach. © 2008 Optical Society of America.

Relevance: 90.00%

Abstract:

Most pavement design procedures incorporate reliability to account for the effect of design-input uncertainty and variability on predicted performance. The load and resistance factor design (LRFD) procedure, which delivers an economical section while considering the variability of design inputs separately, has been recognised as an effective tool for incorporating reliability into design procedures. This paper presents a new reliability-based calibration in LRFD format for a mechanics-based fatigue cracking analysis framework. It employs a two-component reliability analysis methodology that utilises a central composite design-based response surface approach and a first-order reliability method. The reliability calibration was achieved using a number of field pavement sections that have well-documented performance histories and high-quality field and laboratory data. The effectiveness of the developed LRFD procedure was evaluated by performing pavement designs for various target reliabilities and design conditions. The results show excellent agreement between the target and actual reliabilities. Furthermore, it is clear from the results that more design features need to be included in the reliability calibration to minimise the deviation of the actual reliability from the target reliability.
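The reliability computation underlying such a calibration can be illustrated with a hedged sketch (not the paper's response-surface framework): a Monte Carlo estimate of the reliability index beta for a simple capacity-minus-demand limit state, with invented distribution parameters.

```python
import random
from statistics import NormalDist

def reliability_index(mu_r, sigma_r, mu_s, sigma_s, n=200_000, seed=1):
    """Monte Carlo estimate of the failure probability Pf and reliability
    index beta for the limit state g = R - S (capacity minus demand).
    R and S are modelled as independent normals -- an illustrative
    assumption, not the paper's calibrated model."""
    rng = random.Random(seed)
    failures = sum(
        rng.gauss(mu_r, sigma_r) < rng.gauss(mu_s, sigma_s)
        for _ in range(n)
    )
    pf = failures / n
    beta = -NormalDist().inv_cdf(pf)   # beta = -Phi^-1(Pf)
    return beta, pf

# Exact beta for these parameters: (2.0 - 1.0) / sqrt(0.3**2 + 0.2**2) ~= 2.77
beta, pf = reliability_index(mu_r=2.0, sigma_r=0.3, mu_s=1.0, sigma_s=0.2)
```

In an LRFD calibration, the load and resistance factors would then be tuned until beta matches the target reliability for each design condition.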

Relevance: 90.00%

Abstract:

The objectives of the experiment were to assess the impact of nitrogen (N) and potassium (K) fertiliser application on the cell wall composition and fast-pyrolysis conversion quality of the commercially cultivated hybrid Miscanthus x giganteus. Five different fertiliser treatments were applied to mature Miscanthus plants, which were sampled at five intervals over a growing season. The different fertiliser treatments produced significant variation in the concentrations of cell wall components and ash within the biomass, and affected the composition and quality of the resulting fast-pyrolysis liquids. The results indicated that application of high rates of N fertiliser had a negative effect on feedstock quality for this conversion pathway, reducing the proportion of cell wall components and increasing the accumulation of ash in the harvested biomass. No exclusive effect of potassium fertiliser was observed. The low-N fertiliser treatment produced high-quality, low-ash, high-lignin biomass most suitable as a feedstock for thermo-chemical conversion. © 2010.

Relevance: 90.00%

Abstract:

3D geographic information systems (GIS) are data- and computation-intensive in nature. Internet users are usually equipped with low-end personal computers and network connections of limited bandwidth. Data reduction and performance optimization techniques are therefore of critical importance in quality of service (QoS) management for online 3D GIS. In this research, QoS management issues in distributed 3D GIS presentation were studied to develop 3D TerraFly, an interactive 3D GIS that supports high-quality online terrain visualization and navigation. To tackle the QoS management challenges, a multi-resolution rendering model, adaptive level of detail (LOD) control, and mesh simplification algorithms were proposed to effectively reduce terrain model complexity. The rendering model is adaptively decomposed into sub-regions of up to three detail levels according to viewing distance and other dynamic quality measurements. The mesh simplification algorithm was designed as a hybrid algorithm that combines edge straightening and quad-tree compression to reduce mesh complexity by removing geometrically redundant vertices. The main advantage of this mesh simplification algorithm is that the grid mesh can be processed directly in parallel without triangulation overhead. Algorithms facilitating remote access and distributed processing of volumetric GIS data, such as data replication, directory service, request scheduling, and predictive data retrieval and caching, were also proposed. A prototype of 3D TerraFly implemented in this research demonstrates the effectiveness of the proposed QoS management framework in handling interactive online 3D GIS. System implementation details and future directions of this research are also addressed in this thesis.
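The "geometrically redundant vertex" idea can be sketched with a toy criterion: an interior grid vertex is droppable when its height is within a tolerance of the average of its four axis neighbours. This is a simplified stand-in for the thesis's hybrid edge-straightening/quad-tree algorithm, with an assumed flatness test.

```python
def redundant_vertices(grid, eps=0.01):
    """Mark interior vertices of a regular height grid whose elevation is
    within eps of the average of their four axis neighbours -- a simple
    stand-in for the 'geometrically redundant vertex' test used in
    grid-mesh simplification (illustrative criterion, not the thesis
    algorithm). Returns a list of (row, col) indices safe to remove."""
    rows, cols = len(grid), len(grid[0])
    redundant = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            avg = (grid[i-1][j] + grid[i+1][j]
                   + grid[i][j-1] + grid[i][j+1]) / 4.0
            if abs(grid[i][j] - avg) < eps:
                redundant.append((i, j))
    return redundant
```

Because each vertex is tested only against its local neighbourhood, tiles of the grid mesh can be processed independently, matching the parallelism advantage noted above.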

Relevance: 90.00%

Abstract:

The purpose of this study was to design a preventive scheme using directional antennas to improve the performance of mobile ad hoc networks. In this dissertation, a novel Directionality based Preventive Link Maintenance (DPLM) scheme is proposed to characterize the performance gain [JaY06a, JaY06b, JCY06] obtained by extending the life of links. In order to maintain a link and take preventive action, the signal strength of data packets is measured. Moreover, location information or angle-of-arrival information is collected during communication and saved in a table. When the measured signal strength falls below an orientation threshold, an orientation warning is generated towards the previous-hop node. Once the orientation warning is received by the previous-hop (adjacent) node, it verifies the correctness of the warning with a few hello pings, initiates a high-quality directional link (a link above the threshold), and immediately switches to it, avoiding a link break altogether. The location information is utilized to create a directional link by orienting the neighboring nodes' antennas towards each other. We call this operation an orientation handoff, which is similar to soft handoff in cellular networks. Signal strength is the indicating factor that represents the health of the link and helps to predict link failure; link breakage happens due to node movement, which progressively reduces the signal strength of received packets. The DPLM scheme helps ad hoc networks avoid or postpone the costly operation of route rediscovery in on-demand routing protocols by taking the above-mentioned preventive action. This dissertation advocates close but simple collaboration between the routing, medium access control, and physical layers. In order to extend the link, the Dynamic Source Routing (DSR) and IEEE 802.11 MAC protocols were modified to use the ability of directional antennas to transmit over longer distances.
A directional antenna module was implemented in the OPNET simulator with two separate modes of operation: omnidirectional and directional. The antenna module was incorporated in the wireless node model, and simulations were performed to characterize the performance improvement of mobile ad hoc networks. Extensive simulations have shown that, without noticeably affecting the behavior of the routing protocol, aggregate throughput, packet delivery ratio, end-to-end delay (latency), routing overhead, number of data packets dropped, and number of path breaks are improved considerably. Analysis of the results in different scenarios shows that the use of directional antennas with the proposed DPLM scheme is promising for improving the performance of mobile ad hoc networks.
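The preventive trigger can be sketched as a threshold check on recent packet signal strengths; the -75 dBm threshold and the three-packet smoothing window are assumptions for illustration, not values from the dissertation.

```python
def check_link(rssi_samples, orientation_threshold=-75.0):
    """Return 'orientation_warning' when the smoothed received signal
    strength (dBm) of recent data packets drops below the orientation
    threshold, else 'ok'. The threshold value and 3-packet smoothing
    window are illustrative assumptions."""
    window = rssi_samples[-3:]          # smooth over the last 3 packets
    avg = sum(window) / len(window)
    return "orientation_warning" if avg < orientation_threshold else "ok"
```

On a warning, the previous-hop node would verify the link state with a few hello pings and then perform the orientation handoff described above.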

Relevance: 90.00%

Abstract:

This dissertation examines the quality of hazard mitigation elements in a coastal, hazard-prone state. I answer two questions. First, in a state with a strong mandate for hazard mitigation elements in comprehensive plans, does plan quality differ among county governments? Second, if such variation exists, what drives it? My research focuses primarily on Florida's 35 coastal counties, which are all at risk for hurricane and flood hazards, and all fall under Florida's mandate to have a comprehensive plan that includes a hazard mitigation element. Research methods included document review to rate the hazard mitigation elements of all 35 coastal county plans, and subsequent analysis against demographic and hazard-history factors. Following this, I conducted an electronic, nationwide survey of planning professionals and academics, informed by interviews of planning leaders in Florida counties. I found that hazard mitigation element quality varied widely among the 35 Florida coastal counties, but was close to a normal distribution. No plans were of exceptionally high quality. Overall, historical hazard effects did not correlate with hazard mitigation element quality, but some demographic variables associated with urban populations did. The variance in hazard mitigation element quality indicates that while state law may mandate, and even prescribe, hazard mitigation in local comprehensive plans, not all plans will result in equal, or even adequate, protection for people. Furthermore, the mixed correlations with demographic variables representing social and disaster vulnerability show that, at least at the county level, vulnerability to hazards does not have a strong effect on hazard mitigation element quality. From a theory perspective, my research is significant because it compares assumptions about vulnerability based on hazard history and demographics to plan quality.
The only vulnerability-related variables that appeared to correlate, and at that only mildly, with hazard mitigation element quality were those typically representing more urban areas. In terms of the theory of Neo-Institutionalism and theories related to learning organizations, my research shows that planning departments appear to have set norms and rules of operating that preclude both significant public involvement and learning from prior hazard events.

Relevance: 90.00%

Abstract:

In an article entitled "The Specialist: Coming Soon to Your Local Hotel," Stan Bromley, Regional Vice President and General Manager of the Four Seasons Clift Hotel, San Francisco, opens with: "An experienced hotelier discusses the importance of consistently delivering a high quality-to-value ratio to guests, particularly as the hotel market becomes specialized and a distinction is drawn between a 'property' and a 'hotel.'" The author's primary intention is to make the reader aware of changes in the hospitality/hotel marketplace. From its beginnings to the present, the hotel market has evolved continuously; this includes, but is not limited to, mission statements, marketing, management, facilities, and all the tangibles and intangibles of the total hotel experience. "Although we are knocking ourselves out trying to be everything to everyone, I don't think hotel consumers are as interested in 'mixing and matching' as they were in the past," Bromley says. "Today's hotel guest is looking for 'specialized care,' and is increasingly skeptical of our industry-wide hotel ads and promises of greatness." As an example, Bromley draws an analogy with retail outlets such as Macy's, Saks, and Sears, which cater to their own unique market segments. Hotels now follow the same outline, he allows. "In my view, two key factors will make a hotel a success," advises Bromley. "First, know your specialty and market to that segment. Second, make sure you consistently offer a high quality-to-value ratio. That means every day." To emphasize that second point, Bromley adds: "The second factor that will make or break your business is your ability to deliver a high quality-to-value ratio, and to do so consistently." The author evidently considers the quality-to-value ratio to be an important element. Bromley also emphasizes the importance of convention and trade show business to the hotel industry.
That business element cannot be overestimated, in his opinion. This does not mean that an operator who can accommodate that type of business should exclude client opportunities outside the target market; it does mean, however, that these secondary opportunities should only be addressed after pursuing the primary target strategy. After all, the largest profit margin lies in the center of the target. To amplify the above statement, and in reference to his own experience, Bromley says, "Being in the luxury end of the business I, on the other hand, need to uncover and book individuals and small corporate meetings more than convention or association business."

Relevance: 90.00%

Abstract:

The tropical echinoid Echinometra viridis was reared in controlled laboratory experiments at temperatures of approximately 20°C and 30°C, to mimic winter and summer temperatures, and at carbon dioxide (CO2) partial pressures of approximately 487 ppm-v and 805 ppm-v, to simulate current and predicted end-of-century levels. Spine material produced during the experimental period and dissolved inorganic carbon (DIC) of the corresponding culture solutions were then analyzed for stable oxygen (delta18Oe, delta18ODIC) and carbon (delta13Ce, delta13CDIC) isotopic composition. Fractionation of oxygen stable isotopes between the echinoid spines and the DIC of their corresponding culture solutions (Delta18O = delta18Oe - delta18ODIC) was significantly inversely correlated with seawater temperature but not significantly correlated with atmospheric pCO2. Fractionation of carbon stable isotopes between the echinoid spines and the DIC of their corresponding culture solutions (Delta13C = delta13Ce - delta13CDIC) was significantly positively correlated with pCO2 and significantly inversely correlated with temperature, with pCO2 functioning as the primary factor and temperature moderating the pCO2-Delta13C relationship. Echinoid calcification rate was significantly inversely correlated with both Delta18O and Delta13C, both within treatments (i.e., pCO2 and temperature fixed) and across treatments (i.e., with the effects of pCO2 and temperature controlled for through ANOVA).
Therefore, calcification rate, and potentially the rate of co-occurring dissolution, appear to be important drivers of the kinetic isotope effects observed in the echinoid spines. Study results suggest that echinoid Delta18O monitors seawater temperature, but not atmospheric pCO2, and that echinoid Delta13C monitors atmospheric pCO2, with temperature moderating this relationship. These findings, coupled with echinoids' long and generally high-quality fossil record, support prior assertions that fossil echinoid delta18O is a viable archive of paleo-seawater temperature throughout Phanerozoic time, and that delta13C merits further investigation as a potential proxy of paleo-atmospheric pCO2. However, the apparent impact of calcification rate on echinoid delta18O and delta13C suggests that paleoceanographic reconstructions derived from these proxies in fossil echinoids could be improved by incorporating the effects of growth rate.

Relevance: 90.00%

Abstract:

The evaluation of seed vigor is an important factor in identifying lots of high-quality seeds, and the development of procedures to evaluate physiological potential has become an important tool in seed quality control programs. This study therefore aimed to adapt the accelerated aging, electrical conductivity, and potassium leaching methodologies to evaluate the vigor of Moringa oleifera Lam. seeds. Four lots of moringa seeds were subjected to germination, seedling emergence, speed of emergence index, first emergence count, seedling length and dry mass, and cold tests for their physiological characterization, in addition to accelerated aging, electrical conductivity, and potassium leaching. The experimental design was completely randomized with four replications of 50 seeds, and the means were compared by Tukey's test at 5% probability. For accelerated aging, aging periods of 12, 24, and 72 hours at 40, 42, and 45°C were studied. The electrical conductivity test was conducted at 25°C for immersion periods of 4, 8, 12, 16, and 24 hours in 75 or 125 mL of distilled water, using 25 or 50 seeds; for the potassium leaching test, samples of 25 or 50 seeds were placed in plastic cups containing 70 or 100 mL of distilled water at 25°C for periods of 1, 2, 3, 4, 5, and 6 hours. From the results obtained, it can be inferred that the best-fitting methods for the accelerated aging test of moringa seeds were 40°C for 12 to 72 hours, 42°C for 72 hours, and 45°C for 24 hours.
In the electrical conductivity test of moringa seeds, the combinations of 50 seeds in 75 mL of distilled water with an immersion period of 4 hours, and 50 seeds in 125 mL for 4 hours, were efficient for differentiating lots of moringa seeds by vigor. For the potassium leaching test, the combination of 50 seeds in 100 mL of distilled water allowed the separation of the lots into four levels of vigor at 2 hours of immersion, showing promise for evaluating the quality of moringa seeds.
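The logic of the conductivity test, in which higher electrolyte leakage indicates weaker seed membranes and thus lower vigor, can be sketched as a simple ranking; the lot labels and readings below are invented for illustration.

```python
def rank_lots_by_vigor(conductivity):
    """Rank seed lots by vigor from bulk electrical-conductivity
    readings (uS cm^-1 g^-1 of seed): lower leachate conductivity means
    more intact membranes, hence higher vigor. Returns lot names from
    most to least vigorous. Values are hypothetical examples."""
    return sorted(conductivity, key=conductivity.get)

readings = {"lot_A": 18.2, "lot_B": 25.7, "lot_C": 12.4, "lot_D": 31.1}
ranking = rank_lots_by_vigor(readings)
```

In practice the lots would then be compared statistically (e.g., Tukey's test, as in the study) rather than by raw ordering alone.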


Relevance: 90.00%

Abstract:

The purpose of using software based on numerical approximations for metal forming arises from the need to ensure process efficiency in order to obtain high-quality products at the lowest cost and in the shortest time. This study uses the theory of similitude to develop a technique capable of simulating the stamping process of a metal sheet, obtaining results close to the real values with shorter processing times. The results are obtained through simulations performed in the finite element software STAMPACK®. This software uses the explicit time-integration method, which is usually applied to solve nonlinear problems involving contact, such as metal forming processes. The technique was developed from a stamping model of a square box, simulated with four different scale factors, two larger and two smaller than the real scale. The technique was validated with a bending model of a welded plate, which had a long simulation time. The application of the technique decreased simulation time by over 50%. The results of applying the scale technique to formed plates were satisfactory, showing good quantitative results related to the decrease in total simulation time. Finally, it is noted that the decrease in simulation time is only possible with the use of two related scales: the geometric scale and the kinematic scale. The kinematic scale factors should be used with caution, because high speeds can cause dynamic problems and could influence the results of the simulations.
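The coupling of the geometric and kinematic scales can be illustrated as follows; the relation time-scale = geometric-scale / kinematic-scale keeps the scaled tool travel consistent with the scaled geometry. This is a generic similitude sketch with invented values, not the study's exact scaling laws.

```python
def scaled_parameters(length, velocity, time, geo_scale, kin_scale):
    """Apply geometric and kinematic similitude scales to a stamping
    model: lengths scale by geo_scale, punch velocity by kin_scale, and
    process time by geo_scale / kin_scale, so that the scaled velocity
    times the scaled time still covers the scaled travel distance.
    Illustrative relations only."""
    return {
        "length": length * geo_scale,       # e.g. blank/tool dimensions
        "velocity": velocity * kin_scale,   # e.g. punch speed
        "time": time * geo_scale / kin_scale,
    }

# Half-size model run at double speed finishes in a quarter of the time.
model = scaled_parameters(length=100.0, velocity=10.0, time=2.0,
                          geo_scale=0.5, kin_scale=2.0)
```

Raising kin_scale shortens the simulated process time, which is the source of the speed-up, but, as the abstract warns, it also raises speeds and can introduce spurious dynamic effects.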

Relevance: 90.00%

Abstract:

BACKGROUND: It is unclear whether diagnostic protocols based on cardiac markers to identify low-risk chest pain patients suitable for early release from the emergency department can be applied to patients older than 65 years or with traditional cardiac risk factors. METHODS AND RESULTS: In a single-center retrospective study of 231 consecutive patients with a high risk-factor burden, in which a first cardiac troponin (cTn) level was measured in the emergency department and a second cTn sample was drawn 4 to 14 hours later, we compared the performance of a modified 2-Hour Accelerated Diagnostic Protocol to Assess Patients with Chest Pain Using Contemporary Troponins as the Only Biomarker (ADAPT) rule to a new risk classification scheme that identifies patients as low risk if they have no known coronary artery disease, a nonischemic electrocardiogram, and 2 cTn levels below the assay's limit of detection. Demographic and outcome data were abstracted through chart review. The median age of our population was 64 years, and 75% had a Thrombolysis In Myocardial Infarction risk score ≥2. Using our risk classification rule, 53 (23%) patients were low risk, with a negative predictive value for 30-day cardiac events of 98%. Applying a modified ADAPT rule to our cohort, 18 (8%) patients were identified as low risk, with a negative predictive value of 100%. In a sensitivity analysis, the negative predictive value of our risk algorithm did not change when we relied only on an undetectable baseline cTn and eliminated the second cTn assessment. CONCLUSIONS: If confirmed in prospective studies, this less-restrictive risk classification strategy could be used to safely identify chest pain patients with more traditional cardiac risk factors for early emergency department release.
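The study's three-condition classification rule can be sketched directly; the limit-of-detection value used here (0.01 ng/mL) is an assumed placeholder, since the assay's actual LoD is not stated in this summary.

```python
def is_low_risk(known_cad, ischemic_ecg, ctn_values, limit_of_detection=0.01):
    """Risk rule described in the study: a patient is classified as low
    risk only with (1) no known coronary artery disease, (2) a
    nonischemic ECG, and (3) all cTn measurements below the assay's
    limit of detection. The LoD default is an illustrative assumption,
    not the assay's real value."""
    return (not known_cad
            and not ischemic_ecg
            and all(v < limit_of_detection for v in ctn_values))
```

The sensitivity analysis in the abstract corresponds to calling this with only the baseline cTn value in `ctn_values`.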

Relevance: 90.00%

Abstract:

The global prevalence of obesity in the older adult population is growing, an increasing concern in both the developed and developing countries of the world. The study of geriatric obesity and its management is a relatively new area of research, especially pertaining to those with elevated health risks. This review characterizes the state of the science for this "fat and frail" population and identifies the many gaps in knowledge where future study is urgently needed. In community-dwelling older adults, opportunities to improve both body weight and nutritional status are hampered by inadequate programs to identify and treat obesity, but where support programs exist, there are proven benefits. The nutritional status of the hospitalized older adult should be optimized to overcome the stressors of chronic disease, acute illness, and/or surgery. The least restrictive diets tailored to individual preferences while meeting each patient's nutritional needs will facilitate the energy required for mobility, respiratory sufficiency, immunocompetence, and wound healing. Complications of care due to obesity in the nursing home setting, especially in those with advanced physical and mental disabilities, are becoming more ubiquitous; in almost all of these situations, weight stability is advocated, as some evidence links weight loss with increased mortality. High-quality interdisciplinary studies in a variety of settings are needed to identify standards of care and effective treatments for the most vulnerable obese older adults.

Relevance: 90.00%

Abstract:

The amount and quality of available biomass is a key factor for the sustainable livestock industry and for decision making related to agricultural management. Globally, 31.5% of land cover is grassland, while 80% of Ireland's agricultural land is grassland. In Ireland, grasslands are intensively managed and provide the cheapest feed source for animals. This dissertation presents a detailed state-of-the-art review of satellite remote sensing of grasslands, and the potential application of optical (Moderate-resolution Imaging Spectroradiometer (MODIS)) and radar (TerraSAR-X) time series imagery to estimate grassland biomass at two study sites (Moorepark and Grange) in the Republic of Ireland, using both statistical and state-of-the-art machine learning algorithms. High-quality weather data available from the on-site weather station were also used to calculate the Growing Degree Days (GDD) for Grange, to determine the impact of ancillary data on biomass estimation. In situ and satellite data covering 12 years for the Moorepark and 6 years for the Grange study sites were used to predict grassland biomass using multiple linear regression and Adaptive Neuro-Fuzzy Inference System (ANFIS) models. The results demonstrate that a dense (8-day composite) MODIS image time series, along with high-quality in situ data, can be used to retrieve grassland biomass with high performance (R2 = 0.86, p < 0.05, RMSE = 11.07 for Moorepark). The model for Grange was modified to evaluate the synergistic use of vegetation indices derived from remote sensing time series and accumulated GDD information. As GDD is strongly linked to plant development, or phenological stage, an improvement in biomass estimation would be expected. It was observed that using the ANFIS model the biomass estimation accuracy increased from R2 = 0.76 (p < 0.05) to R2 = 0.81 (p < 0.05), and the root mean square error was reduced by 2.72%.
The work on the application of optical remote sensing was further developed using a TerraSAR-X Staring Spotlight mode time series over the Moorepark study site, to explore the extent to which very high resolution Synthetic Aperture Radar (SAR) data of interferometrically coherent paddocks can be exploited to retrieve grassland biophysical parameters. After filtering out the non-coherent plots, it is demonstrated that interferometric coherence can be used to retrieve grassland biophysical parameters (i.e., height, biomass), and that it is possible to detect changes due to grass growth and to grazing and mowing events, when the temporal baseline is short (11 days). However, it is not possible to automatically and uniquely identify the cause of these changes based only on the SAR backscatter and coherence, due to the ambiguity caused by tall grass laid down by the wind. Overall, the work presented in this dissertation has demonstrated the potential of dense remote sensing and weather data time series to predict grassland biomass using machine-learning algorithms, where high-quality ground data were used for training. At present, a major limitation for national-scale biomass retrieval is the lack of spatial and temporal ground samples, which can be partially resolved by minor modifications to the existing PastureBaseIreland database, adding the location and extent of each grassland paddock. As far as remote sensing data requirements are concerned, MODIS is useful for large-scale evaluation, but due to its coarse resolution it is not possible to detect variations within and between fields at the farm scale. However, this issue will be resolved in terms of spatial resolution by the Sentinel-2 mission, and when both satellites (Sentinel-2A and Sentinel-2B) are operational the revisit time will reduce to 5 days, which, together with Landsat-8, should provide sufficient cloud-free data for operational biomass estimation at a national scale.
The Synthetic Aperture Radar Interferometry (InSAR) approach is feasible if enough coherent interferometric pairs are available; however, this is difficult to achieve due to the temporal decorrelation of the signal. For repeat-pass InSAR over a vegetated area, even an 11-day temporal baseline is too large. In order to achieve better coherence, a very high resolution is required at the cost of spatial coverage, which limits its scope for use in an operational context at a national scale. Future InSAR missions with pair acquisition in tandem mode will minimize the temporal decorrelation over vegetated areas for more focused studies. The proposed approach complements the current paradigm of Big Data in Earth Observation, and illustrates the feasibility of integrating data from multiple sources. In future, this framework can be used to build an operational decision support system for the retrieval of grassland biophysical parameters, based on data from long-term planned optical missions (e.g., Landsat, Sentinel) that will ensure the continuity of data acquisition. Similarly, the Spanish X-band PAZ and TerraSAR-X2 missions will ensure the continuity of TerraSAR-X and COSMO-SkyMed.
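The Growing Degree Days ancillary input used above is conventionally accumulated from daily temperature extremes; the sketch below uses the standard averaging method, with an assumed 5°C base temperature for grass (the dissertation's actual base temperature is not stated in this summary).

```python
def growing_degree_days(daily_temps, t_base=5.0):
    """Accumulate Growing Degree Days from daily (t_min, t_max) pairs in
    degC using the standard averaging method:
        GDD_day = max(0, (t_min + t_max) / 2 - t_base).
    The 5 degC base temperature for grass is an assumed value."""
    return sum(max(0.0, (tmin + tmax) / 2.0 - t_base)
               for tmin, tmax in daily_temps)
```

The accumulated GDD series would then be fed alongside the vegetation-index time series into the regression or ANFIS model as an extra predictor.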

Relevance: 90.00%

Abstract:

Two concepts in rural economic development policy have been the focus of much research and policy action: the identification and support of clusters or networks of firms and the availability and adoption by rural businesses of Information and Communication Technologies (ICT). From a theoretical viewpoint these policies are based on two contrasting models, with clustering seen as a process of economic agglomeration, and ICT-mediated communication as a means of facilitating economic dispersion. The study’s conceptual framework is based on four interrelated elements: location, interaction, knowledge, and advantage, together with the concept of networks which is employed as an operationally and theoretically unifying concept. The research questions are developed in four successive categories: Policy, Theory, Networks, and Method. The questions are approached using a study of two contrasting groups of rural small businesses in West Cork, Ireland: (a) Speciality Foods, and (b) firms in Digital Products and Services. The study combines Social Network Analysis (SNA) with Qualitative Thematic Analysis, using data collected from semi-structured interviews with 58 owners or managers of these businesses. Data comprise relational network data on the firms’ connections to suppliers, customers, allies and competitors, together with linked qualitative data on how the firms established connections, and how tacit and codified knowledge was sourced and utilised. The research finds that the key characteristics identified in the cluster literature are evident in the sample of Speciality Food businesses, in relation to flows of tacit knowledge, social embedding, and the development of forms of social capital. In particular the research identified the presence of two distinct forms of collective social capital in this network, termed “community” and “reputation”. 
By contrast, the sample of Digital Products and Services businesses does not have the form of a cluster, but matches more closely dispersive models, or "chain" structures. Much of the economic and social structure of this set of firms is best explained in terms of "project organisation", and by the operation of an individual rather than collective form of "reputation". The rural setting in which these firms are located has resulted in their being service-centric, and consequently they rely on ICT-mediated communication in order to exchange tacit knowledge "at a distance". It is this factor, rather than inputs of codified knowledge, that most strongly influences their operation and their need for the availability and adoption of high-quality communication technologies. Thus the findings have applicability in relation to theory in Economic Geography and to policy and practice in Rural Development. In addition, the research contributes to methodological questions in SNA, and to methodological questions about the combination or mixing of quantitative and qualitative methods.
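As a minimal illustration of the SNA side of such a method, normalised degree centrality over a list of firm ties can be computed directly; the firm names and ties below are hypothetical, and real studies would use richer measures (betweenness, brokerage) over the supplier/customer/ally/competitor relations.

```python
def degree_centrality(edges):
    """Normalised degree centrality for an undirected firm network given
    as (firm, firm) ties -- the simplest SNA measure of how connected
    each business is. Each node's degree is divided by (n - 1), the
    maximum possible number of ties. Firm names are illustrative."""
    nodes = {n for edge in edges for n in edge}
    degree = {n: 0 for n in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    n = len(nodes)
    return {firm: d / (n - 1) for firm, d in degree.items()}

ties = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "D")]
centrality = degree_centrality(ties)
```

A densely interlinked group such as the Speciality Foods cluster would show uniformly high centralities, while a "chain" structure concentrates centrality in a few broker firms.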