Archive

  • 2012 Volume 14 Issue 6
    Published: 25 December 2012
      

    ARTICLES
  • ARTICLES
    CHEN Nai-Cheng, WANG Xiao-Lei, WANG Chao

    Integrating semantics into the Sensor Web for Earth Observation enables a clear, detailed description and consistent expression of resources. Because sensors are numerous and their accompanying resources heterogeneous, semantic technology plays a major role in providing resource discovery, automatic composition, and machine interpretation in an interoperable way. This article summarizes developments in the field and outlines the concepts, structure, and functions of the Semantic Sensor Web (SSW). Research on the Sensor Web and the Semantic Web has driven the rapid growth of the SSW, whose progress reflects a coherent framework for accessing, sharing, discovering, and managing resources across different applications. Most efforts in this area have focused on the key technologies of the SSW, which can be used to construct typical sensor- and observation-centric ontologies. The integration of semantic technologies into the Sensor Web has been proposed as an important component of complex, heterogeneous, and dynamic information systems: it improves the interoperation of heterogeneous resources and communication protocols, and has been developed to achieve intelligent expression, discovery, and services in the Sensor Web. The Sensor Web for Earth Observation currently lacks synergy and cannot meet the diverse requirements of comprehensive and emergency monitoring; semantics could solve these problems and provide large-scale observation, highly efficient distributed information fusion, and real-time information services. We point out the challenges that a ground-airborne-spaceborne integrated SSW faces in achieving event awareness, synergistic observation, efficient processing, and focused services; these challenges can be seen as an evolutionary advancement of interoperation on the Sensor Web. Future work needs to operate at a wider and deeper scale to establish a new generation of ontologies and frameworks for the Sensor Web.

  • ARTICLES
    ZHANG Jin

    The geosensor network system for mine ground disaster monitoring is an integrated system built on the comprehensive application of geomatics technologies: satellite remote sensing, geographic information systems, satellite positioning, georobots, ground-based SAR, and sensor networks. Predicting mine ground disasters in a timely and accurate manner is of great practical significance for preventing and reducing losses caused by mine ground deformation. The main research topics of such a system include whole-process data handling theory and methods, geographic gridding and function partitioning, data fusion, spatial data clustering analysis, disaster effect analysis, intelligent forecasting models and theory, and the spatial database and service platform. Using time-series monitoring data from multiphase geosensor networks, high-resolution satellite remote sensing data, and function partition data, a dynamic deformation field over the mining region can be established from the study of its spatiotemporal variations; its effectiveness can then be evaluated against measured data and the deformation field model optimized. Finally, a geosensor network GIS for mine ground disaster monitoring can be developed to integrate the management and analysis of monitoring data for the purposes of safe mine production and ground deformation monitoring and forecasting.

  • ARTICLES
    SONG Hong-Quan, LIU Hua-Jun, LV Guo-Nian, ZHANG Xin-Guo

    With rapid socioeconomic development, massive crowd gatherings occur frequently, and casualties are often caused by high crowd density. Video surveillance has therefore become a matter of national policy in many countries, and surveillance cameras have been installed at important places throughout cities. Real-time monitoring of crowd status in gathering areas can provide an important basis for crowd management and emergency warning. Existing video-based crowd analysis can only monitor the crowd status of each camera separately, so the spatio-temporal patterns of regional crowd status cannot be obtained from a spatial perspective. In this paper, we propose a video-GIS framework for crowd analysis in which video frames are mapped to geographic space, so that crowd images can be processed and crowd density and crowd movement vector fields extracted in GIS. The crowd movement pattern and the main direction of movement can then be acquired by vector field analysis. Finally, we design and implement a real-time monitoring system for regional crowd status using the video surveillance system and GIS. Experimental results show that: (1) previous crowd analysis methods based on image space can only measure results in pixels and require further conversion to obtain real-world values, whereas the proposed method yields real-world values directly by processing crowd images in GIS; (2) the accuracy of pixel-based low-density crowd counting reaches 90%, and the classification accuracy of the support vector machine classifier for high-density crowd levels exceeds 95%, which fully meets the needs of crowd monitoring; (3) the crowd movement pattern, the main movement direction, and the crowd speed in different directions can all be obtained by analyzing the crowd movement vector field in GIS, where all of these crowd characteristics can be expressed; and (4) the developed system can be applied to crowd management and emergency warning, providing a decision-making basis for emergency prevention and crowd diversion.
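
The mapping of video frames into geographic space can be illustrated with a plane-to-plane projective transform. Below is a minimal sketch (not the paper's implementation): a homography fitted from four or more image/ground control-point pairs by direct linear transformation, then used to project pixel coordinates into map coordinates.

```python
import numpy as np

def fit_homography(img_pts, geo_pts):
    """Estimate a 3x3 homography H mapping image pixels to ground
    coordinates from >= 4 point correspondences (DLT, least squares)."""
    A = []
    for (u, v), (x, y) in zip(img_pts, geo_pts):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    A = np.asarray(A, dtype=float)
    # The homography is the null vector of A (smallest singular value).
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def to_ground(H, u, v):
    """Project one pixel (u, v) to ground coordinates (x, y)."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

With such a transform, per-pixel crowd measurements carry real-world units directly, which is the point made in result (1) above.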

  • ARTICLES
    LIN Na, SHU Chang-Jing, LIN Shu-Jing

    Sites such as the national "MapWorld" have recently attracted growing interest from the scientific and industrial communities, mainly because of the many applications a map service makes possible; a crucial issue in publishing tile map web services is therefore copyright protection. To secure the communication of tile maps in the network environment, a novel watermarking algorithm for tile maps based on a mapping mechanism is presented, by which an invisible and robust watermark can be embedded effectively. In the network environment, geographic data is usually stored as tiles at service terminals, with an indexing mechanism used as the storage method to manage the huge volume of tile data. However, tile maps stored under an indexing mechanism impose different watermarking requirements than common images and raster map data. This paper therefore first analyzes the characteristics and watermarking requirements of tile maps stored under an indexing mechanism, which guide watermark embedding and detection. Meaningless watermark information is then generated by an m-sequence, and on this basis the mapping-mechanism watermark algorithm for tile maps is presented: the watermark is embedded into the blue channel using a mapping function generated from the red and green channels, and detection is the inverse process of embedding. Finally, a real-time copyright protection system for tile maps is implemented and verified experimentally. The experiments confirm that the proposed algorithm is robust to cropping, splicing, and additive noise, and that embedding has little effect on the size of the original tile map.
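
The "meaningless watermark information generated by an m-sequence" can be sketched with a simple linear feedback shift register; the register width and tap positions below are illustrative assumptions, not taken from the paper.

```python
def m_sequence(taps, state, length):
    """Generate a maximal-length (m-)sequence with a Fibonacci LFSR.
    taps  : feedback tap positions (1-based, e.g. [4, 3] for x^4 + x^3 + 1)
    state : non-zero initial register contents, stage 1 first
    """
    reg = list(state)
    out = []
    for _ in range(length):
        out.append(reg[-1])            # output the last stage
        fb = 0
        for t in taps:
            fb ^= reg[t - 1]           # XOR the tapped stages
        reg = [fb] + reg[:-1]          # shift, insert feedback at stage 1
    return out
```

An n-stage maximal LFSR repeats with period 2^n - 1 and has balanced, noise-like statistics, which is what makes it useful as "meaningless" watermark payload.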

  • ARTICLES
    MA Xiao-Ya, GUO Qiang-Qing

    The automatic generalization of linear features is an important aspect of map generalization, as linear features make up more than 80 percent of map objects. Graphic simplification, a major means of linear feature generalization, has long concerned scholars at home and abroad, and much research has produced a large number of automated algorithms. As intelligent optimization algorithms have become widely used in various fields, many scholars have tried to introduce them into map generalization; several have applied genetic algorithms and ant colony algorithms to linear feature graphic simplification, achieving good results but also exposing some defects. The artificial immune system (AIS) developed relatively late, but it has been widely applied in various fields with remarkable results. In this paper we present a new model for automatic graphic simplification of linear features based on the basic principles of the clonal selection algorithm, a kind of AIS, and analyze the graphic simplification constraints of linear feature data compression, taking into account geometric precision and shape preservation. We then design an appropriate encoding mechanism, mutation mechanism, and affinity function, combined with a repair mechanism for infeasible solutions to improve simplification precision. Finally, we compare the simplification results of the clonal selection algorithm with those of the Douglas-Peucker algorithm and the genetic algorithm. Experiments show that, at the same geometric accuracy, the proposed model performs better in preserving the shape of linear features, and they verify the feasibility of artificial immune systems for the linear feature graphic simplification problem.
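
As a rough illustration of the clonal selection idea (a generic toy, not the authors' model), a candidate simplification can be encoded as a bitmask over interior vertices, with an affinity that prefers fewer kept vertices among candidates meeting a distance tolerance; here infeasible mutants are simply scored worst, a crude stand-in for the paper's repair mechanism.

```python
import random

def seg_dist(p, a, b):
    """Distance from point p to segment ab."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def error(line, mask):
    """Max offset of dropped vertices from the simplified polyline."""
    kept = [0] + [i + 1 for i, b in enumerate(mask) if b] + [len(line) - 1]
    worst = 0.0
    for s, e in zip(kept, kept[1:]):
        for i in range(s + 1, e):
            worst = max(worst, seg_dist(line[i], line[s], line[e]))
    return worst

def clonal_simplify(line, tol, pop=20, clones=5, gens=60, seed=1):
    """Toy clonal selection: evolve vertex-keep bitmasks; affinity favours
    fewer kept vertices among masks meeting the tolerance."""
    rng = random.Random(seed)
    n = len(line) - 2
    def affinity(m):
        return -sum(m) if error(line, m) <= tol else -n - 1  # infeasible worst
    best = [1] * n                                  # keep everything: feasible
    popl = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)] + [best]
    for _ in range(gens):
        popl.sort(key=affinity, reverse=True)
        elite = popl[: pop // 2]
        offspring = []
        for m in elite:
            for _ in range(clones):                 # clone and hypermutate
                c = m[:]
                c[rng.randrange(n)] ^= 1
                offspring.append(c)
        popl = elite + offspring
    return max(popl, key=affinity)
```

A real implementation would also grade the mutation rate by affinity and repair infeasible clones instead of discarding them, as described above.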

  • ARTICLES
    DONG Jian, BANG Ren-Can, ZHANG Li-Hua, LI Ning, GU Shuai-Dong

    The multi-scale representation of spatial data is one of the most important and difficult problems in GIS. The Digital Depth Model (DDM) is a digitized model reflecting the depth variation of the ocean. As an important representation of sea floor relief, the DDM is not only a principal information source for guaranteeing safe navigation, but also an information platform for marine geoscience research, maritime engineering, underwater archaeology, and more. As the marine geographic environment develops, the application fields of DDM keep expanding, which raises the requirement for its multi-scale representation. In the same sea area, DDMs at different scales are different digitized representations of the identical sea floor landform, so studying multi-scale representation methods based on an original DDM is an effective approach to multi-scale application. The bathymetric contour, an important visualization of the DDM, shares the same multi-scale representation constraints. Existing multi-scale representation algorithms for DDM are mostly dimensional extensions of generalization algorithms for two-dimensional bathymetric contour graphics: they barely consider the geometric characteristics of the DDM and simplify it merely by deleting some grid points and retaining feature points. Yet DDM generalization is not a simple matter of accepting or rejecting grid points; factors including the geographic and scale character of the DDM must be considered to maintain consistency of spatial cognition and level of abstraction.
    Focusing on the geographic and scale character of the DDM, analyzing the essential principle of the two-dimensional rolling circle transform, and extending that transform by one dimension, this paper puts forward a multi-scale representation of DDM based on a rolling ball transform: three-dimensional balls of different sizes, instead of a planar circle, are rolled over the upper surface of the DDM, which preserves the positive relief of the DDM while reducing its negative relief, realizing multi-scale representation from the viewpoint of safe navigation. The paper also details the key points and solution steps of the algorithm. Finally, experiments implemented in VC++ validate the algorithm, showing that it preserves the basic characteristics of the DDM with high computational efficiency.
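
A rolling ball touching a surface from above can be approximated by grey-scale morphological closing with a ball-shaped structuring element: narrow peaks (the navigationally critical positive relief) survive, while pits narrower than the ball are filled. This is a simplified sketch of the general idea, not the paper's algorithm.

```python
import numpy as np

def ball(radius):
    """Height profile of the lower half of a ball (radius in grid cells)."""
    r = int(radius)
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    d2 = x * x + y * y
    b = np.full(x.shape, -np.inf)                 # -inf marks cells off the ball
    inside = d2 <= radius * radius
    b[inside] = np.sqrt(radius * radius - d2[inside])
    return b

def rolling_ball(surface, radius):
    """Morphological closing of a height grid with a ball rolled on its
    upper side: grey dilation followed by grey erosion. Peaks narrower
    than the ball are kept; narrow pits are filled (negative relief reduced)."""
    b = ball(radius)
    r = b.shape[0] // 2
    h, w = surface.shape
    padded = np.pad(surface, r, mode="edge")
    dil = np.empty_like(surface, dtype=float)
    for i in range(h):
        for j in range(w):
            dil[i, j] = np.max(padded[i:i + 2 * r + 1, j:j + 2 * r + 1] + b)
    padded = np.pad(dil, r, mode="edge")
    out = np.empty_like(surface, dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.min(padded[i:i + 2 * r + 1, j:j + 2 * r + 1] - b)
    return out
```

Varying the ball radius yields representations at different scales, mirroring the paper's use of different ball sizes for different target scales.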

  • ARTICLES
    LI Ting, XU Zhu, LEI Cai-Xia, XU Bing, LI Mu-Zi, HUANG Meng-Meng

    Road traffic conditions change constantly in both the spatial and temporal domains. Traffic information of high spatiotemporal resolution is valuable to the study of road traffic dynamics, but the massive data volume of high-resolution traffic information over a large spatiotemporal extent makes data organization and management difficult. In fact, there is no well-accepted spatiotemporal data model specialized for managing such data that is effective in terms of storage and access efficiency. To overcome these drawbacks, this paper proposes a base state amendment model (BSAM) for dynamic traffic conditions built on a linear referencing system (LRS). The model uses the basic BSAM strategy to perform lossless compression in the time dimension, while compressing data in the spatial domain through the LRS and dynamic segmentation. In addition, road strokes rather than road segments are used for route identification in the LRS to further reduce data volume. We validate the effectiveness of the proposed model by applying it to traffic condition data for the Chengdu urban area. Six variants of the model, denoted BSAMa through BSAMf (see section 3), are analyzed and compared; they differ mainly in the number of base states, the time interval between successive base states, and the number of non-base states. Their storage and access efficiencies are analyzed both qualitatively and quantitatively, and the results indicate that BSAMf (see Fig. 4) has the best storage and access efficiency for the data used. Furthermore, a method based on the dynamic characteristics of the traffic condition data is suggested for estimating the number of non-base states between successive base states in BSAMf. With the Chengdu data, it is demonstrated that two non-base states between successive base states is the most appropriate pattern for BSAMf in terms of storage and access efficiency. In sum, the experimental results demonstrate that the proposed BSAM for traffic condition data is effective and efficient.
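
The base-state-with-amendments strategy in the time dimension can be sketched in a few lines: store the first epoch in full and, for later epochs, keep only the road entries whose value changed. This is a toy illustration of the temporal compression only; the paper's six variants and LRS-based spatial compression are not modeled here.

```python
def compress(snapshots):
    """Base-state-with-amendments encoding: keep the first snapshot in
    full, then for each later epoch store only the entries whose value
    changed since the previous epoch."""
    base = dict(snapshots[0])
    deltas = []
    prev = dict(base)
    for snap in snapshots[1:]:
        delta = {k: v for k, v in snap.items() if prev.get(k) != v}
        deltas.append(delta)
        prev = snap
    return base, deltas

def reconstruct(base, deltas, t):
    """Rebuild the full traffic state at epoch t (0 = base state)."""
    state = dict(base)
    for delta in deltas[:t]:
        state.update(delta)
    return state
```

Because traffic on most links changes slowly between epochs, the deltas are small; the trade-off is that reconstructing epoch t must replay t amendments, which is why the spacing of base states matters in the variants compared above.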

  • ARTICLES
    LI Mu-Zi, XU Zhu, LI Zhi-Lin, ZHANG Gong, TI Feng

    Generalization of road networks is one of the focuses of map generalization. It can be considered the combination of two processes: selective omission, and the simplification of the selected roads. Selective omission is the key process, in which it is hard to maintain the overall and key local structures of the original network. Many solutions have been proposed for road selective omission, but previous solutions cannot maintain these structures. The problem can be solved by building the hierarchical structure of the road network and making the selection based on that structure. This paper presents a novel method for selective omission. The method first builds the hierarchical structure of the road network using the Hierarchical Random Graph (HRG), a technique widely used in complex network research that transforms a graph into a dendrogram. HRG goes beyond simple clustering and provides clustering information at all levels of granularity for visualization. However, HRG is overly detailed for multi-scale representation, as its dendrogram usually contains tens of layers or more. So, after building the HRG of a road network, we propose a measure named Accumulated Probability Number (APN) to simplify the HRG hierarchy; APN reflects the importance of each road in the whole network. It should be noted that road 'strokes' are used as vertices, and the connections between them as edges, when transforming a road network into a graph. The proposed approach is validated with case studies of road network generalization covering different network patterns, including grid, ring-star hybrid, grid-star hybrid, and irregular patterns. The corresponding Google Maps representation is used as the reference for evaluating road selection, and the results of APN-based selection match the reference well.

  • ARTICLES
    CHEN Jin-Hai, LIU Feng, BANG Guo-Jun, KE Dan-Xuan

    To monitor real-time vessel information and improve navigation safety, China's Maritime Safety Administration (MSA) has built the world's biggest Automatic Identification System (AIS) shore-based network, in which data such as ship position, name, purpose, course, and speed are automatically collected 24 hours a day, primarily in Chinese coastal waters. As a result, China is approaching the era of big-data storage of vessel trajectories, which brings great challenges to traditional moving-object data management systems. Beyond the basic functions of loading and displaying vessel position records, an ideal vessel trajectory database should offer more advanced analysis of ship tracking records by supporting spatio-temporal queries and prediction of vessel movement. In this paper, we start from the character of vessel movement and abstract a data model of vessel trajectories according to the state of the art in moving-object databases. Given the characteristics of vessel trajectory data, namely frequent change, wide coverage, and massive volume, we argue that current trajectory storage methods still need much research and improvement, especially in spatio-temporal query and geoprocessing support. From the time perspective, vessel trajectories are managed at three kinds of time units (sampling instant, stepping period, and 24 hours) to build a three-level organizational framework. By compressing the data volume and matching original vessel tracking messages into spatio-temporal cube units, retrieval efficiency increases significantly. We also describe how the acquisition, loading, filtering, display, and analysis of raw AIS log files are streamlined. The method is applied to handling the daily mass of vessel tracking records covering the western Taiwan Strait. Experience shows that the model satisfies application requirements: storage is reduced and the performance of spatio-temporal queries is improved. Built on the Geodatabase module of the ArcGIS platform, the initial vessel trajectory data model is easy to revise, expand, and integrate with various relational base-map data, and the various ArcGIS geoprocessing tools can conveniently be applied to meet customized demands, such as mapping the daily hot spots of fishing vessel activity. By generating such synthesized products, our solution can help the ocean planning community better understand marine transportation patterns and potential use conflicts between vessels and other activities.
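
The spatio-temporal cube organization can be illustrated by hashing each position report into a (lon, lat, day) cell, so a window query needs to scan only the overlapping cells rather than the full log. The cell size and tuple layout below are assumptions for illustration.

```python
from collections import defaultdict
from datetime import datetime

def cube_key(lon, lat, t, cell_deg=0.1):
    """Index a position report into a (lon, lat, day) cube unit."""
    return (int(lon // cell_deg), int(lat // cell_deg), t.date())

def build_index(reports, cell_deg=0.1):
    """Group raw AIS reports (mmsi, lon, lat, timestamp) by cube unit so a
    spatio-temporal window query touches only the overlapping cells."""
    index = defaultdict(list)
    for mmsi, lon, lat, t in reports:
        index[cube_key(lon, lat, t, cell_deg)].append((mmsi, t))
    return index
```

The daily (24-hour) grouping here corresponds to the coarsest of the three time units mentioned above; a full implementation would nest the sampling-instant and stepping-period levels inside each daily cell.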

  • ARTICLES
    TIAN Jiang-Feng, GU Fen-Li, JIA Jing, TUN Jin-Bing

    Research on map symbols is an important part of cartography. Currently, research on map symbols focuses mainly on visual graphics and pays little attention to semantics. This paper puts forward a method of semantic-driven hierarchical map symbol design, in which the semantic relations of map symbols serve as the benchmark for constructing symbol graphics, and the symbol graphics are controlled by a semantic model, so that the intrinsic value of the semantic components of map symbols can be fully exploited in design activities. We focus on four key steps of the method. The first is semantic feature extraction: we systematically summarize the semantic features of map symbols using the concept of ontological levels. The second concerns morpheme design: the concept and design principles of morphemes and their important role in map symbol design are discussed. The third is modeling of associative semantic relations: modeling methods are discussed, and a practical associative semantic model is constructed from common map symbols for public geographic information. The fourth is semantics-driven generation of map symbols: the processes and characteristics of symbol generation are analyzed. An existing map symbol standard was improved using our symbol design method, and a group of cognitive experiments showed that the proposed method offers superior cognitive efficiency and relatively stable, high transmission efficiency in a simulated information transmission process. In conclusion, the semantic relations of geospatial objects are the core of the semantic-driven hierarchical map symbol design method, which aims to improve the graphic design and understanding of map symbols. Characterized by ontology-oriented symbol design, the method makes map symbols more semantically evident for better recognition, understanding, and application.

  • ARTICLES
    LIU Hui-Min, DENG Min, HE Tie-Jun, XU Shen

    A map is a visualized representation of geospatial entities and their distribution, and users can obtain a large amount of information by reading one. The measurement of map information content is one of the most important basic research issues in the theory of map information transmission, and it has been preliminarily applied to map generalization and many other aspects of map application. The spatial information of a map comprises that of the features and that of their distribution. Existing methods of measuring spatial information content consider only the information content of the spatial distribution among features; the information content of the spatial features themselves is not involved, so the results obtained by existing methods are inaccurate. For this purpose, this paper develops a methodology for measuring the information content of individual spatial features, taking area features as an example. It is widely accepted that geometric shape is the carrier of the geospatial information content of an area feature. Accordingly, the convex hull is first used for shape decomposition of individual area features, and a hierarchical structure called the convex hull tree is proposed to represent an area feature from the viewpoint of spatial cognition. Second, the geometric shape of area features is analyzed over the nodes of the convex hull tree at three levels: node level, neighborhood level, and global level. Quantitative indicators are defined at each level to describe geometric shape, including edge number as an indicator of shape complexity and convexity as an indicator of shape pattern at the node level, with out-degree at the neighborhood level and layer at the global level as indicators of geometric distribution. Subsequently, the corresponding computational models are developed from the geometric characteristics at the three levels and used to measure the spatial information content of individual area features. Finally, an example illustrates the rationality and accuracy of the proposed methods.
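
The node-level convexity indicator can be sketched as the ratio of a polygon's area to the area of its convex hull: 1 for a convex feature, decreasing as concavities deepen. This is a self-contained illustration of one common convexity measure, not necessarily the paper's exact definition.

```python
def hull(points):
    """Andrew's monotone-chain convex hull (counter-clockwise)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def area(poly):
    """Shoelace area of a simple polygon."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def convexity(poly):
    """Node-level shape indicator: polygon area over its hull's area."""
    return area(poly) / area(hull(poly))
```

Repeating the hull decomposition on each concavity (the difference between a shape and its hull) yields the nested structure that the convex hull tree formalizes.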

  • ARTICLES
    LIU Kai, SHANG Guo-An, DAO Yang, JIANG Ku

    Terrain texture is an important natural texture. DEM-based terrain texture has attracted increasing research attention for its purity in representing surface topography and its derivability in terrain analysis. In this paper, 10 sample areas from different landform types of Shaanxi Province were selected for quantitative analysis of terrain texture with the grey level co-occurrence matrix (GLCM) model. Experiments show that, with 25 m resolution DEM data, the suitable analysis distance of the GLCM model is no less than 3 pixels. Among the model's parameters, correlation can be used for texture direction detection; contrast, variance, and difference variance can be applied to texture periodicity analysis; and entropy, angular second moment, and inverse difference moment are suitable for texture complexity investigation. In this research, quantitative analysis was conducted on terrain texture using DEM data, hillshade data, slope data, and curvature data. The texture directivity experiment shows that the correlation of hillshade data responds sensitively to texture direction and can detect the main terrain texture direction; the correlation of slope data responds clearly in rugged topography such as hilly and mountainous regions, so it can play an auxiliary role to hillshade data in detecting texture direction. Results of the periodicity and complexity analysis show that, among DEM data and its derivatives, the mean variation coefficient of each texture parameter is highest for hillshade data, further proving that hillshade data is most suitable for quantitative terrain texture analysis. Quantifying the periodicity of different terrain textures by the variance of hillshade data, the variance eigenvalue increases progressively from flat land through platform and hill to mountain regions, indicating increasing terrain texture periodicity. Texture complexity was analyzed through the angular second moment computed from hillshade data: the eigenvalue has a clear peak in the flat sample regions, decreases markedly for platforms, and verges on zero for hills and mountains, showing that flat terrain has the lowest texture complexity, followed by platforms, with hills and mountains the most complex. This paper also proposes a multi-parameter integrated method that employs comprehensive periodicity and comprehensive complexity in quantitative terrain texture analysis; it not only reduces duplicate analyses but also makes full use of the information in the various texture parameters, and it unifies ranges through normalization for convenient quantitative comparison. The results show that these two parameters respond significantly to different terrain textures, indicating great potential for landform recognition and classification.
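
A minimal GLCM computation with three of the parameters named above (contrast, entropy, angular second moment) might look as follows; the offset convention and grey-level quantization are illustrative assumptions.

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Grey-level co-occurrence matrix for pixel pairs offset by (dx, dy),
    made symmetric and normalised to probabilities."""
    h, w = img.shape
    m = np.zeros((levels, levels), dtype=float)
    for i in range(max(0, -dy), min(h, h - dy)):
        for j in range(max(0, -dx), min(w, w - dx)):
            m[img[i, j], img[i + dy, j + dx]] += 1
    m += m.T                     # count each pair in both orders
    return m / m.sum()

def texture_features(p):
    """Contrast, entropy and angular second moment of a normalised GLCM."""
    idx = np.arange(p.shape[0])
    di, dj = np.meshgrid(idx, idx, indexing="ij")
    contrast = ((di - dj) ** 2 * p).sum()
    nz = p[p > 0]
    entropy = -(nz * np.log2(nz)).sum()
    asm = (p ** 2).sum()
    return contrast, entropy, asm
```

The "analysis distance" discussed above corresponds to the length of the (dx, dy) offset, and the direction sensitivity comes from evaluating the matrix at several offset angles.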

  • ARTICLES
    JIANG Ling, SHANG Guo-An, LIU Kai, SONG Xiao-Dong, YANG Jian-Yi, ZHANG Gang

    As analysis regions become wider and accuracy requirements higher, parallel methods become necessary for data-intensive digital terrain analysis (DTA) to meet users' response-time requirements. Local terrain factors, the fundamental parameters of DTA, are usually calculated over an analysis window of a certain radius (typically 3×3); the result for each pixel is independent and reflects local terrain information. After analyzing the features of serial algorithms for local terrain parameters, this paper studies parallel methods for local terrain factors in depth, taking slope as the example. From the standpoint of data parallelism, the strategies of data division, the partition granularity model, and data fusion in the parallel calculation of local terrain factors are analyzed, and the parallel method is constructed. To verify the correctness and practicality of the method, a parallel experiment on the slope algorithm was designed using a 16 300×17 400 SRTM DEM and implemented and tested on a PC cluster system. The results show that: (1) as the number of processes increases, the execution time of the parallel computation decreases significantly for different partition granularities; when the task number equals the number of processing nodes, the execution times are similar, since the whole DEM can be read by the parallel computing system for each computation task in one pass; (2) the parallel speedup of the slope algorithm rises gradually with increasing partition granularity, and once the granularity grows beyond a certain value, the changes in speedup and efficiency are essentially identical; (3) as processing nodes are added, the execution time of the slope algorithm excluding I/O decreases gradually, and the trend is consistent across granularities; and (4) the execution time is dominated by reading and writing data, so I/O efficiency largely determines parallel efficiency. The research thus indicates that the parallel method efficiently parallelizes sequential local terrain factor algorithms and can greatly increase their execution efficiency with good performance. Its presentation and implementation can also serve as a reference for parallelizing algorithms over similar matrix-type data.
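
The row-strip data division can be sketched as follows. Each strip carries one extra halo row on each interior edge so the 3×3 slope window never crosses a partition; plain `map` stands in for the cluster's worker pool (a `multiprocessing` pool's `map` would drop in unchanged). This illustrates the partitioning idea, not the paper's implementation.

```python
import numpy as np

def slope_block(args):
    """Slope (degrees) for one row strip, central differences over 3x3."""
    block, cell = args
    dzdx = (block[1:-1, 2:] - block[1:-1, :-2]) / (2 * cell)
    dzdy = (block[2:, 1:-1] - block[:-2, 1:-1]) / (2 * cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def parallel_slope(dem, cell, nparts, pool_map=map):
    """Partition rows with halos, dispatch blocks to `pool_map`, merge."""
    dem = np.pad(dem, 1, mode="edge")        # halo for the outer border
    h = dem.shape[0] - 2
    bounds = np.linspace(0, h, nparts + 1, dtype=int)
    tasks = [(dem[s:e + 2, :], cell) for s, e in zip(bounds, bounds[1:])]
    return np.vstack(list(pool_map(slope_block, tasks)))
```

Because each output pixel depends only on its 3×3 window, results are identical for any partition granularity; only the halo rows are duplicated between tasks, which is the fusion cost the granularity model trades against per-task overhead.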

  • ARTICLES
    DUAN Ying-Ying, LIU Feng

    Urban road traffic is spatially autocorrelated. The change of traffic on certain road will quickly affect the traffic on nearby roads, which will alter the overall traffic status within a neighborhood. Revealing the spatial autocorrelation structure in urban road traffic is important for traffic planning, traffic controlling and traffic guidance. The traffic interaction between neighboring roads is not isotropy. The traffic change on certain road does not equally spread to each spatially adjacent road, but concentrate on some of them. Thus only using spatial adjacency to define adjacent roads cannot well reveal the spatial autocorrelation in urban road traffic. Recent research has proved that the dynamic flow on networks highly depend on the structure of networks. Characterizing the structure of urban road network is essential to reveal the spatial autocorrelation in urban road traffic. The aim of this research is to reveal the spatial autocorrelation of urban road traffic based on road network characterization. We first investigate the modular character and hierarchal feature of urban road network quantitatively. The modulars in road network are defined as a group of closely connected neighboring road segments and identified by community detection algorithm from complex network theory. The hierarchal feature of urban road network helps to determine the structural importance of road segments. Topological roles are defined based on the structural importance of road segment. Then we provide a novel approach to define adjacent road segments based on the topological roles in spatially adjacent road segments. Two road segments defined as adjacent road segments not only locate in a nearby neighborhood but also have the same topological roles. A set of adjacent roads constitute a spatial related set. 
Experimental results on the road network of Beijing imply that the spatial related sets identified by the proposed approach can capture the spatial autocorrelation structure of urban road traffic.
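The refined adjacency definition can be sketched as follows. The degree-threshold role assignment is our simplification of the paper's hierarchy-based structural-importance analysis, and all names are illustrative.

```python
def topological_role(degree, hub_threshold=4):
    """Toy stand-in for the paper's structural-importance analysis:
    segments that connect many others are treated as hubs."""
    return "hub" if degree >= hub_threshold else "local"

def spatial_related_set(segment, neighbors, degrees, hub_threshold=4):
    """A segment's related set: spatially adjacent segments that also
    share its topological role (the paper's refined adjacency)."""
    role = topological_role(degrees[segment], hub_threshold)
    return {s for s in neighbors[segment]
            if topological_role(degrees[s], hub_threshold) == role}
```

For example, with `neighbors = {"a": {"b", "c"}}` and `degrees = {"a": 5, "b": 4, "c": 1}`, the related set of `"a"` is `{"b"}`: `"c"` is spatially adjacent but plays a different topological role, so it is excluded.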

  • ARTICLES
    LU Xiao-Ya, SONG Zhi-Hao, XU Zhu, LI Mu-Zi, LI Ting, SUN Wei-E

    Traffic congestion in urban road networks severely restricts transportation efficiency. Detecting traffic congestion in the spatio-temporal sense and identifying network bottlenecks have become important tasks in transportation management. Many traffic congestion detection methods have been proposed, but they focus on detecting momentary local congestion; larger-scale, longer-lasting and recurrent congestion cannot be detected with these methods, because congestion occurs at different spatio-temporal scales and this characteristic is not considered in them. This paper proposes a new urban traffic congestion detection method that addresses the spatio-temporal extension of congestion. It is based on spatio-temporal clustering analysis of real-time traffic data: by defining a proper spatio-temporal correlation, the classic DBSCAN algorithm is adapted to spatio-temporal clustering, with which longer-lasting and recurrent congestion can be detected in the spatio-temporal sense. Experiments were conducted on real traffic condition data from Chengdu to validate the effectiveness of the method. They show that the proposed method can detect congestion areas and identify the spatio-temporal extent of congestion accurately. The detected congestion areas were compared with congestion reports from the local traffic management authority and found to be consistent with the latter.
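The adaptation of DBSCAN to space-time can be sketched as below: the single spatial epsilon is replaced by a joint rule requiring closeness in both space and time. This is a generic sketch of the idea, not the authors' definition of spatio-temporal correlation; all parameter names are ours, and `min_pts` here counts neighbors excluding the point itself.

```python
import math

def st_neighbors(points, i, eps_s, eps_t):
    """Points are (x, y, t) congestion records; a point is a
    spatio-temporal neighbor if it is close in both space and time."""
    x, y, t = points[i]
    return [j for j, (px, py, pt) in enumerate(points)
            if j != i and math.hypot(px - x, py - y) <= eps_s
            and abs(pt - t) <= eps_t]

def st_dbscan(points, eps_s, eps_t, min_pts):
    """DBSCAN with the spatial eps replaced by the joint space-time rule.
    Returns a cluster label per point; -1 marks noise."""
    labels = [None] * len(points)   # None = unvisited
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = st_neighbors(points, i, eps_s, eps_t)
        if len(seeds) < min_pts:
            labels[i] = -1          # noise (may later become a border point)
            continue
        labels[i] = cid
        queue = list(seeds)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid     # noise reclassified as border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            more = st_neighbors(points, j, eps_s, eps_t)
            if len(more) >= min_pts:
                queue.extend(more)  # core point: expand the cluster
        cid += 1
    return labels
```

A cluster found this way is a set of congestion records that are contiguous in both space and time, i.e. one congestion event with its spatio-temporal extent.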

  • ARTICLES
    CHU Wei, FANG Zhi-Xiang, LI Qing-Quan, LU Shi-Wei

    Logistics requires delivery plans to be designed intelligently and quickly. Traditional vehicle routing optimization algorithms can only solve vehicle routing problems with no more than 2000 customers, and at considerable computational cost. This paper proposes a fast heuristic algorithm for large-scale logistics vehicle routing optimization based on Voronoi neighbors. The algorithm creates an initial solution and improves it iteratively. The construction step uses Voronoi neighbors to cluster customers into groups bottom-up, subject to the vehicle capacity constraint; the route within each group is generated by the cheapest insertion algorithm. The improvement step employs local search over k-order Voronoi neighbors to find promising neighboring solutions, and the simulated annealing criterion is adopted to accept some worse solutions and so escape local minima. A computational test was conducted on a synthetic large-scale vehicle routing dataset in Beijing. The results indicate that the proposed algorithm can solve vehicle routing problems with up to 12 000 customers in 4500 seconds, requires little computational effort, and performs quite stably. Comparison with other heuristics shows that the proposed algorithm provides high-quality solutions in a short time, indicating that it is suitable for large-scale logistics vehicle routing optimization.
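Two building blocks named in the abstract, cheapest insertion and the simulated annealing acceptance criterion, can be sketched as follows. This is a generic single-route sketch, not the authors' Voronoi-based implementation; the capacity handling and Voronoi clustering are omitted, and all names are ours.

```python
import math
import random

def cheapest_insertion(depot, customers, dist):
    """Build one route by repeatedly inserting the customer at the
    position that increases route length least."""
    route = [depot, depot]
    remaining = list(customers)
    while remaining:
        best = None                       # (cost increase, customer, slot)
        for c in remaining:
            for k in range(1, len(route)):
                delta = (dist(route[k - 1], c) + dist(c, route[k])
                         - dist(route[k - 1], route[k]))
                if best is None or delta < best[0]:
                    best = (delta, c, k)
        _, c, k = best
        route.insert(k, c)
        remaining.remove(c)
    return route

def sa_accept(delta, temperature, rng=random.random):
    """Simulated annealing criterion: always accept improvements and
    accept a worse move (delta > 0) with probability exp(-delta / T)."""
    return delta <= 0 or rng() < math.exp(-delta / temperature)
```

In the paper's setting, the candidate moves fed to the acceptance test would come from local search restricted to k-order Voronoi neighbors, which keeps the neighborhood small even with thousands of customers.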

  • ARTICLES
    BANG Guo-Jun, TUN Yong-Jun, LIU Xiang, KE Dan-Xuan, DU Zhi-Xiu

    As the world shipping economy continues to advance and science and technology develop, the number and size of large vessels keep growing; the largest vessels in the world, notably very large oil tankers, bulk carriers and large container vessels, have reached five or six hundred thousand dwt. Safe berthing has become a pressing concern for pilots and captains, and the workload of harbors grows ever heavier. However, there is no advanced system to assist pilots in guiding large vessels safely into and out of port, which places pilots under great pressure. To improve the security and efficiency of the berthing process of large vessels, lighten the pilots' heavy burden and reduce the possibility of accidents during pilotage, this paper reviews the history and development of berthing systems, analyzes the berthing process of large ships, studies DGPS combination positioning, WIFI and AIS technologies, builds a ship-based plane coordinate system, formulates a mathematical berthing model for large vessels based on DGPS, provides the means of calculating the dynamic and static parameters, and then presents the structure of the whole system. In this ship-based plane coordinate system the model can accurately calculate the berthing ship's dynamic parameters, such as the exact position of the vessel, the distance between the vessel and the pier, and the relative velocity between the bow and the pier. With the combined technologies of AIS, WIFI and two-point DGPS positioning, a berthing system for large vessels was designed and passed tests of reliability and availability.
The tests show that the system can provide the high-precision information a pilot needs during pilotage, such as latitude and longitude, movement speed, the distance to the wharf line and other auxiliary information. The position accuracy is within 60 cm and the speed accuracy within 5 m/s, which provides more intuitive, convenient and fast navigation for the pilot and thereby effectively improves the security and efficiency of the berthing process of large vessels.
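Two of the dynamic parameters the model computes, the distance from an antenna to the wharf line and the closing speed of bow or stern, reduce to plane geometry once positions are in the ship-based plane coordinate system. The sketch below shows that geometry under our own assumptions (a straight wharf line surveyed as two points, metres, successive DGPS fixes); it is not the paper's full model, and all names are illustrative.

```python
import math

def distance_to_berth(p, berth_a, berth_b):
    """Perpendicular distance from an antenna position p to the wharf
    line defined by two surveyed berth points, all in a local plane
    coordinate system (metres)."""
    abx, aby = berth_b[0] - berth_a[0], berth_b[1] - berth_a[1]
    apx, apy = p[0] - berth_a[0], p[1] - berth_a[1]
    cross = abx * apy - aby * apx          # twice the triangle area
    return abs(cross) / math.hypot(abx, aby)

def approach_speed(d_now, d_prev, dt):
    """Closing speed of bow or stern toward the pier from two successive
    distance fixes dt seconds apart (positive = approaching)."""
    return (d_prev - d_now) / dt
```

Applying both functions to the bow and stern antennas separately yields the bow/stern distances and relative velocities the system displays to the pilot.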

  • ARTICLES
    JIANG Zhi-Ben, BAI Jian-Jun, CA Dun, LI Rui-Yun, JIN Shen-Yu, XU Bing

    In March 2009, a novel swine-origin influenza A(H1N1) virus was first discovered in Mexico and quickly spread to over 200 countries in less than two years. However, limited research has characterized the global spatio-temporal transmission of the pandemic. Applying Ripley's K function based on spherical distances, we analyzed the spatial pattern of the outbreaks of the H1N1 pandemic from March 15, 2009 to June 9, 2009. Compared with other type A influenza outbreaks during 2000-2008, the 2009 H1N1 influenza showed a generally similar temporal trend, but marked differences when we broke down the outbreak data of each country by latitude. To examine these differences, we further associated the number of weekly H1N1 cases with national arrivals through customs. The results show that the 2009 H1N1 outbreaks were spatially clustered in the early period. The maximum value of the L function was identical to that of the 65 global cities, within which 79 percent of the outbreaks were distributed within a radius of 600 km. In addition, the correlation coefficients show that the highest positive correlation between national arrivals and weekly influenza cases (r=0.7, p=0.002) occurred in the 19th week. These findings suggest that global cities are the key nodes of the network that disseminates international travel, and hence the virus, in the early period of a pandemic. Time series analysis further showed that seasonal environmental factors also influenced the pandemic. Unexpectedly, some countries in the northern temperate zone reported more confirmed human cases in June and July, a period thought to be unsuitable for influenza transmission. Meanwhile, the winter case peaks of the countries north of the Tropic of Cancer clustered between the 45th and 48th weeks, earlier than the common type A influenza season. This might be partially due to the lack of immunity in the population against the pandemic A(H1N1)2009 virus.
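Ripley's K on spherical distances can be sketched as follows: great-circle distances between outbreak locations via the haversine formula, and a naive K estimate that counts ordered pairs closer than a given distance. This sketch omits the edge corrections and the L transformation a real analysis would use, and is not the authors' implementation; all names are ours.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def ripley_k(points, d, area):
    """Naive (uncorrected) estimate of Ripley's K at distance d:
    K(d) = area / n^2 * number of ordered pairs closer than d.
    Clustering shows up as K(d) exceeding the value expected under
    complete spatial randomness."""
    n = len(points)
    pairs = sum(1 for i in range(n) for j in range(n)
                if i != j and haversine(*points[i], *points[j]) <= d)
    return area * pairs / (n * n)
```

Evaluating K over a range of distances and comparing it with the expectation under complete spatial randomness is what supports statements such as "79 percent of the outbreaks lay within a 600 km radius" of the clustering centres.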

  • ARTICLES
    FENG Zhe, TUN Jian-Sheng, GAO Yang, BANG Jian, ZONG Min-Li, WANG Zheng

    Landscape multifunctionality is a hotspot in landscape ecology. To explore a method that reflects both the integrity and the independence of landscape multifunctionality, this research focuses on the clustering of landscape functions, taking Beijing and its peripheral area, China, as the study area. Five landscape function intensities (material production, carbon storage, soil retention, habitat conservation and population support) are calculated on a grid map using a variety of ecological models and indices. Then, based on the calculated landscape functions, the study area is clustered with a self-organizing feature map (SOFM) model. The quantitative results show that different regions have distinct and relatively unique priority functions. Beijing and its peripheral area can be divided into four landscape function regions: an agricultural region, whose dominant function is material production; an urban region, whose dominant functions are population support and carbon storage; an ecological region, whose dominant functions are soil retention and habitat conservation; and a transition region, which has no dominant function but reflects the interaction between humans and nature. Validation of the results also shows that the SOFM neural network model is an effective and appropriate method for cluster analysis. The clustering results exhibit significant regional heterogeneity, with notable differences among the four cluster types within the study area. This spatially comprehensive dataset, combined with the SOFM approach's independence from mechanistic ecological assumptions, provides a unique opportunity to validate and assess modeling efforts. The dominant landscape functions influencing regional development differ from one area to another, and the characteristics of the landscape indices and functions vary with region. Despite its limitations and uncertainty, applying the presented SOFM clustering method to landscape functions in connection with high-performance computing is encouraged as an interesting and important goal for future studies. Approaches to achieving sustainable regional development were illustrated and their importance highlighted for policy makers and stakeholders.
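The SOFM clustering step can be sketched as a minimal one-dimensional self-organizing map: each grid cell's vector of function intensities is assigned to its best-matching node, and node weights are pulled toward their samples with a shrinking neighborhood. This is a textbook sketch under our own parameter choices, not the authors' network configuration; all names are illustrative.

```python
import numpy as np

def train_sofm(data, n_nodes=4, epochs=50, lr=0.5, seed=0):
    """Minimal 1-D self-organizing feature map. Each node's weight
    vector is pulled toward samples for which it is the best-matching
    unit (BMU), with a Gaussian neighborhood and decaying learning rate."""
    rng = np.random.default_rng(seed)
    w = rng.random((n_nodes, data.shape[1]))
    for epoch in range(epochs):
        radius = max(1.0, n_nodes / 2 * (1 - epoch / epochs))
        for x in data:
            bmu = np.argmin(np.linalg.norm(w - x, axis=1))
            dist = np.abs(np.arange(n_nodes) - bmu)   # distance on the map
            h = np.exp(-(dist / radius) ** 2)          # neighborhood weights
            w += lr * (1 - epoch / epochs) * h[:, None] * (x - w)
    return w

def assign_clusters(data, w):
    """Cluster label of each sample = index of its best-matching node."""
    return [int(np.argmin(np.linalg.norm(w - x, axis=1))) for x in data]
```

In the paper's setting each row of `data` would be a grid cell's five function intensities, and the node labels would yield the agricultural, urban, ecological and transition regions.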

  • ARTICLES
    FENG Mei-Chen, NIU Bei, YANG Wu-De, XIAO Lu-Ji

    Buckwheat, mainly distributed in the central and western regions of China, is one of the major food and economic crops in these areas; its production and development directly affect farmers' income and the agricultural economy there. Climate regionalization of buckwheat quality is an important basis for optimizing the cultivation environment and the layout of quality varieties, and provides valuable guidance for high-quality buckwheat production. Exploring the best planting pattern of buckwheat can therefore provide a theoretical and practical basis for regionalization and high-quality production at larger scales. However, current climate regionalization studies of buckwheat have mainly focused on suitable cultivation zoning with traditional research methods, and no report on climatic regionalization of buckwheat quality using principal component analysis (PCA) and GIS was found. In this study, the correlations between buckwheat quality indices and meteorological factors were analyzed together with the geographical distribution of buckwheat, and the major meteorological factors affecting the quality indices were screened out. An evaluation model was established with the PCA method to assess the comprehensive quality of buckwheat, and the ecological regionalization was then determined using ArcGIS spatial analysis. The results showed that temperature, rainfall and sunshine hours were the main ecological factors affecting the buckwheat quality indices. Daily maximum temperatures above 35℃ and high mean temperatures harmed buckwheat growth and hindered the accumulation of quality components, whereas abundant precipitation in August and plentiful sunshine throughout the growing period favored it. Combining the evaluation model with GIS, Jinzhong Prefecture was divided into three planting regions (well adapted, less adapted and poorly adapted), which largely agreed with actual conditions. PCA combined with GIS is a powerful approach for regionalization, integrating quantitative computation with qualitative analysis. It is therefore feasible to carry out ecological regionalization of buckwheat quality using PCA and GIS, and the regionalization results are objective and scientific. The method is concise, applicable and effective; it reflects the actual regional situation and provides a reference for high-quality buckwheat production.
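A common way to build such a PCA-based comprehensive score, and one consistent with the evaluation-model step described above, is to standardize the quality indices, project them onto the principal components, and combine the component scores weighted by their explained-variance shares. The sketch below shows that generic construction; it is our assumption of the model form, not the authors' exact formula.

```python
import numpy as np

def pca_quality_score(X):
    """Comprehensive quality score for each sample (row of X):
    standardize the indices, project onto the principal components,
    and combine component scores weighted by explained-variance share."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each index
    cov = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]            # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    weights = eigvals / eigvals.sum()            # variance shares
    scores = Z @ eigvecs                         # component scores
    return scores @ weights                      # weighted composite
```

Mapping these per-site composite scores onto a grid in a GIS and classifying them into intervals is what yields the three planting regions of the regionalization.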