Projects

Research & Publications

Authors:
Lam, H., & Tang, V.

Abstract:
During the pandemic, attention to and demand for the cold chain increased owing to the considerable use of low-temperature logistics in transporting perishable goods and vaccines. To ensure shipping performance and reduce damage, logistics companies must continually and repetitively track the status of shipments every day. However, manually typing various air waybill numbers to search for shipping status is a frequent source of errors. Tracking shipping status is also labor-intensive, resource-intensive, inefficient and repetitive, and such repetitive tasks result in low employee satisfaction. Therefore, robotic process automation (RPA) applications have gained the attention of practitioners in the cold chain logistics industry. This study contributes to (i) determining possible areas requiring automation through a workflow study on cold chain logistics and (ii) streamlining the operation by developing robotic process automation bots. A case study tested and evaluated the performance of two unattended RPA bots applied in a freight forwarding company to check shipment status and temperature conditions. The results showed that implementing RPA in the workflow significantly reduces data processing time. With the proposed RPA bots, the company can better comprehend its logistics shipping performance and receive an immediate notification from the bots when an abnormal situation occurs with regard to a shipment.

Authors:
Lam, H. Y., Ho, G. T. S., Mo, D. Y., & Tang, V.

Abstract:
In the rapidly growing e-commerce industry, pallet picking is no longer feasible as the stock keeping units (SKUs) change from the pallet level to the carton or item level. Thus, an effective order-picking process focusing on fast and efficient retrieval of SKUs from shelves is required to fulfil numerous small lot-sized e-commerce orders within a short time. This study investigates a responsive pick face replenishment (RPFR) strategy that divides the high-bay racks in the distribution centres (DCs) into two parts: the upper-deck reserve areas and the pick-face forward areas to improve the operational efficiency in order picking. To address the fluctuating order demand and limited space in the pick-face forward areas, the proposed RPFR system integrates a predictive analytics algorithm with an adaptive network-based fuzzy inference system (ANFIS) and adaptive genetic algorithm-based stock allocation model to generate an optimal stock replenishment plan. By predicting the order demand of each SKU in the next time interval, the types of selected SKUs and their quantities to be loaded into the pick-face forward areas are determined. Numerical experiments are performed to validate the system performance, and comparative analyses are conducted to determine the best parameter settings for the models.
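
For illustration, the sketch below shows the kind of decision the RPFR stock allocation stage makes, reduced to a greedy heuristic: given per-SKU demand forecasts (produced by ANFIS in the paper) and limited pick-face space, choose what to load. The greedy rule, the dict inputs and the example figures are assumptions for this sketch, not the paper's adaptive genetic algorithm.

```python
def allocate_pick_face(predicted_demand, unit_volume, capacity):
    """Greedy stand-in for the paper's GA-based stock allocation: fill the
    limited pick-face space with the SKUs forecast to be picked most in the
    next time interval. Inputs are assumed dicts keyed by SKU."""
    plan, used = {}, 0.0
    for sku in sorted(predicted_demand, key=predicted_demand.get, reverse=True):
        qty = min(predicted_demand[sku],
                  int((capacity - used) // unit_volume[sku]))  # fit what space allows
        if qty > 0:
            plan[sku] = qty
            used += qty * unit_volume[sku]
    return plan

demand = {'A': 120, 'B': 45, 'C': 300}       # hypothetical demand forecasts (picks)
volume = {'A': 0.02, 'B': 0.05, 'C': 0.01}   # m^3 per unit (assumed)
print(allocate_pick_face(demand, volume, capacity=4.0))  # {'C': 300, 'A': 50}
```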

Authors:
Ng, T. C., Choy, S. K., Lam, S. Y., & Yu, K. W.

Abstract:
This article presents a multi-phase image segmentation methodology based on fuzzy superpixel decomposition, aggregation and merging. First, a collection of layers of dense fuzzy superpixels is generated by the variational fuzzy decomposition algorithm. Then a layer of refined superpixels is extracted by aggregating various layers of dense fuzzy superpixels using the hierarchical normalized cuts. Finally, the refined superpixels are projected into the low dimensional feature spaces by the multidimensional scaling and the segmentation result is obtained via the mean-shift-based merging approach with the spatial bandwidth adjustment strategy. Our algorithm utilizes the superimposition of fuzzy superpixels to impose more accurate spatial constraints on the final segmentation through the fuzzy superpixel aggregation. The fuzziness of superpixels also provides spatial features to measure affinities between fuzzy superpixels and refined superpixels, and guide the merging process. Comparative experiments with the existing approaches reveal a superior performance of the proposed method.

Authors:
Ng, S. C. H., Ho, G. T. S., & Wu, C. H.

Abstract:
The literature on quality management system (QMS) assumes that product and process performance data are authentic and easily accessible. This assumption, while ideologically sound, is questionable in practise because the authenticity and accessibility of data cannot be guaranteed in many circumstances. Inaccurate, incomplete, inconsistent, and inaccessible data are common in supply chains and prevent the QMS from achieving its goal: assuring product and process quality to meet customer requirements. This study is one of the first to examine the impact of data quality and data latency on process control and quality analysis which are elemental parts of daily QMS activities, from a supply chain visibility (SCV) perspective. In this study, five propositions are made to show the relationships between technology, SCV, and data issues. More importantly, the study proposes a platform that integrates Blockchain (BC) technology, Industrial Internet of Things (IIoT), and Big Data to solve data problems in SCV and QMS. We further perform fuzzy association rule mining (FARM) to show how the platform can solve quality analysis problems and complete a closed-loop process control cycle in manufacturing. We also explain the contributions of the integrated platform to QMS from four theoretical perspectives. Finally, we discuss the limitations of the platform and provide recommendations for future research.

Authors:
Li, X. J., Tian, G. L., Zhang, M., Ho, G. T. S., & Li, S.

Abstract:
Under-dispersed count data often appear in clinical trials, medical studies, demography, actuarial science, ecology, biology, industry and engineering. Although the generalized Poisson (GP) distribution possesses the twin properties of under- and over-dispersion, over the past 50 years many authors have treated the GP distribution only as an alternative to the negative binomial distribution for modeling over-dispersed count data. To the best of our knowledge, the calculation of maximum likelihood estimates (MLEs) of parameters in the GP model, without and with covariates, for the case of under-dispersion had not been solved until now. In this paper, we first develop a new minimization–maximization (MM) algorithm to calculate the MLEs of parameters in the GP distribution with under-dispersion, and then we develop another new MM algorithm to compute the MLEs of the vector of regression coefficients for the GP mean regression model for the case of under-dispersion. Three hypothesis tests (i.e., the likelihood ratio, Wald and score tests) are provided. Some simulations are conducted. The Bangladesh demographic and health surveys dataset is analyzed to illustrate the proposed methods, and comparisons with the existing Conway–Maxwell–Poisson regression model are also presented.

Authors:
Shang, L., Xu, P. F., Shan, N., Tang, M. L., & Ho, G. T. S.

Abstract:
One of the main concerns in multidimensional item response theory (MIRT) is to detect the relationship between observed items and latent traits, which is typically addressed by exploratory analysis and factor rotation techniques. Recently, an EM-based L1-penalized log-likelihood method (EML1) was proposed as a vital alternative to factor rotation. Based on the observed test response data, EML1 can yield a sparse and interpretable estimate of the loading matrix. However, EML1 suffers from a high computational burden. In this paper, we consider the coordinate descent algorithm to optimize a new weighted log-likelihood, and consequently propose an improved EML1 (IEML1) that is more than 30 times faster than EML1. The performance of IEML1 is evaluated through simulation studies, and an application to a real data set from the Eysenck Personality Questionnaire is used to demonstrate our methodologies.
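
The core computational idea of IEML1, cyclic coordinate descent under an L1 penalty, can be illustrated on a generic penalized least-squares surrogate. This is a minimal sketch of the soft-thresholding update, not the authors' weighted log-likelihood or their E-step.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Closed-form coordinate minimiser under an L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def coordinate_descent_lasso(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for min_b 0.5*||y - Xb||^2 + lam*||b||_1,
    yielding the kind of sparse estimate EML1/IEML1 seek for the loading matrix."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)              # precomputed column norms
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]     # partial residual excluding coordinate j
            b[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return b
```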

Authors:
Yung, K. L., Tsang, Y. P., Wu, C. H., & Ip, W. H.

Abstract:
In recent aerospace missions, space logistics has proven essential in storing, delivering and returning crew and materials between terrestrial facilities and space stations. Unlike classical commercial logistics, space logistics operations are cost-prohibitive and mission-driven, and the replenishment cycle for essential materials is relatively long. Therefore, full utilisation of spacecraft payload is of utmost importance. The theory of the inventory packing problem is extended in this study to build autonomous agents that interact with one another within a space logistics decision support system to reinforce replenishment decisions, chunk loading optimisation, and quality inspection. Given the long replenishment cycle time, an agent embedded with interval type-2 fuzzy logic is explored to support chaotic time-series demand forecasting and derive re-order quantities in the desired period. Afterwards, the second agent solves the space chunk loading problem using the differential evolution algorithm to fully utilise payloads and capacities, particularly for cylindrical chunks. The third agent deploys three-dimensional object scanning devices to measure actual item dimensions and quality, and feeds the results back to the second agent to derive optimal chunk-loading instructions. Thanks to the autonomous interactions among the above agents, mission-critical decisions for space logistics are supported to achieve operational excellence.
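
As a toy illustration of the second agent's role, the sketch below uses SciPy's differential evolution to pick which chunks to load under a payload limit. The binary load/skip encoding, the masses and the payload figure are assumptions; the actual agent optimises three-dimensional loading of cylindrical chunks.

```python
import numpy as np
from scipy.optimize import differential_evolution

mass = np.array([120.0, 80.0, 60.0, 200.0, 45.0, 150.0])  # hypothetical chunk masses (kg)
PAYLOAD = 400.0                                            # assumed payload limit (kg)

def neg_utilisation(x):
    """Fitness: maximise loaded mass; infeasible (overweight) picks score worse."""
    load = np.round(x) @ mass                  # round each gene to a load/skip decision
    return -load if load <= PAYLOAD else load - PAYLOAD

result = differential_evolution(neg_utilisation, bounds=[(0, 1)] * len(mass), seed=0)
print(np.round(result.x), -result.fun)         # chosen chunks and payload used
```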

Authors:
Ho, G. T. S., Tang, Y. M., Lam, H. Y., & Tang, V.

Abstract:
The rapidly growing e-commerce sector has created new opportunities and challenges for the logistics industry. Nonetheless, the majority of the Hong Kong logistics industry, especially small and medium-sized enterprises (SMEs), lack operational decision support to adequately handle seasonal, fragmented, and fluctuating e-commerce orders. To grasp the opportunities of the e-commerce logistics business, logistics service providers (LSPs) should enhance their capabilities in information exchange and operational planning. With these improvements, the logistics industry would be better able to sustain and expand its e-commerce logistics business. In this paper, a Blockchain-based E-Commerce Analytics Model is developed to enhance digital supply chain integration. First, timely operational decision support is achieved through blockchain technology; a machine learning (ML) algorithm then enables the logistics industry to manage data efficiently and to forecast dynamic e-commerce order demand. Subsequently, the proposed model allows LSPs to flexibly re-allocate the right number of resources in real time to deal with the hour-to-hour fluctuating arrival of orders in distribution centers. Additionally, the proposed model enables logistics practitioners to predict sales performance related to e-commerce.

Authors:
Lam, H. Y., Tang, V., & Ho, G. T. S.

Abstract:
Due to the outbreak of COVID-19, increasing attention has been paid to designing a cold chain logistics mechanism to ensure the quality of vaccine delivery. In this study, a cold chain digital twins-based risk analysis model is constructed to handle and monitor the vaccine delivery process with a high level of reliability and traceability. The model integrates the Internet of Things (IoT) and digital twins to acquire data on environmental conditions and shipment movements and connect physical cold chain logistics to the digital world. Through the simulation of cold chain logistics in a virtual environment, the risk levels relating to physical operations at a certain forecast horizon can be predicted beforehand, to prevent a “broken” cold chain. The result of this investigation will reshape the cold chain in the digital age, benefit society in terms of sustainability and environmental impact, and hence contribute to the development of cold chain logistics in Hong Kong.

Authors:
Men, J., Hou, Y., Sheng, Z., & Chan, T. T.

Abstract:
The high mobility feature of vehicular networks poses tremendous challenges to maintaining network connectivity. In this paper, we investigate the possibility of enhancing the connectivity of Cellular Vehicle-to-Everything (C-V2X) networks through distributed trajectory adjustment. Based on a physical layer abstraction model, we characterize the network connectivity enhancement problem as a network utility maximization and study its concavity. We propose a distributed trajectory updating algorithm that dynamically adjusts the trajectory of vehicles on top of their planned trajectory. The algorithm is distributed and requires only geo-location exchanges, which are readily available in V2X networks. Simulation results show that the mobility updating algorithm converges and improves the aggregated network utility by up to 48% compared to the scenarios without mobility tuning.
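
One way to picture the distributed update is each vehicle taking its planned step plus a small gradient-ascent correction on a locally computed utility of neighbours' geo-locations. The log-sum utility with power-law path loss below is an assumed stand-in for the paper's physical layer abstraction, not its actual model.

```python
import numpy as np

def local_utility(pos_i, neighbours, alpha=2.0):
    """Illustrative local utility: log of aggregate link quality to neighbours,
    with 1/d^alpha path loss (assumed form)."""
    d = np.linalg.norm(neighbours - pos_i, axis=1)
    return np.log(np.sum(1.0 / (1e-9 + d ** alpha)))

def trajectory_step(pos_i, neighbours, planned_step, eps=0.5, h=1e-3):
    """One distributed update: follow the planned trajectory, then nudge the
    position along a numerical gradient of the local utility."""
    grad = np.zeros(2)
    for k in range(2):
        e = np.zeros(2); e[k] = h
        grad[k] = (local_utility(pos_i + e, neighbours) -
                   local_utility(pos_i - e, neighbours)) / (2 * h)
    return pos_i + planned_step + eps * grad / (np.linalg.norm(grad) + 1e-9)

print(trajectory_step(np.zeros(2), np.array([[50.0, 0.0], [0.0, 80.0]]),
                      planned_step=np.array([1.0, 0.0])))
```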

Authors:
Wang, Q., Mei, X., Liu, H., Leung, Y. W., Li, Z., & Chu, X.

Abstract:
Energy conservation of large data centers for high performance computing workloads, such as deep learning with Big Data, is of critical significance, where cutting down a few percent of electricity translates into million-dollar savings. This work studies energy conservation on emerging CPU-GPU hybrid clusters through dynamic voltage and frequency scaling (DVFS). We aim at minimizing the total energy consumption of processing a batch of offline tasks or a sequence of real-time tasks under deadline constraints. We derive a fast and accurate analytical model to compute the appropriate voltage/frequency setting for each task, and assign multiple tasks to the cluster with heuristic scheduling algorithms. In particular, our model stresses the nonlinear relationship between task execution time and processor speed for GPU-accelerated applications, for more accurately capturing real-world GPU energy consumption. In performance evaluation driven by real-world power measurement traces, our scheduling algorithm shows comparable energy savings to the theoretical upper bound. With a GPU scaling interval where analytically at most 36% of energy can be saved, we record 33-35% of energy savings. Our results are applicable to energy management on modern heterogeneous clusters.
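
The frequency-selection step can be sketched as follows: for each voltage/frequency pair, estimate runtime with a model in which only part of the task scales with core speed (the nonlinear relationship the paper stresses), then keep the feasible pair with the lowest energy. The time and power models and all constants here are illustrative assumptions, not the authors' fitted model.

```python
def pick_frequency(freqs, volts, t0, w, deadline, p_static=20.0, c=1.0):
    """Choose the voltage/frequency pair minimising energy under a deadline.

    Assumed models: t(f) = t0 + w/f (partially frequency-scalable runtime)
    and P = p_static + c*V^2*f (classic CMOS dynamic power)."""
    best = None
    for f, v in zip(freqs, volts):
        t = t0 + w / f
        if t > deadline:
            continue                            # misses the deadline: infeasible
        energy = (p_static + c * v * v * f) * t
        if best is None or energy < best[0]:
            best = (energy, f, v)
    return best                                 # (energy, frequency, voltage) or None

print(pick_frequency(freqs=[0.8, 1.0, 1.2, 1.4], volts=[0.9, 1.0, 1.1, 1.2],
                     t0=10.0, w=30.0, deadline=40.0))
```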

Authors:
Tsang, Y. P., Wu, C. H., Ip, W. H., & Lee, C. K. M.

Abstract:
Solder paste printing (SPP) is one of the critical processes in printed circuit board assembly (PCBA), reliably applying solder paste on raw PCBs for component placement through surface mount technology. Although computerised SPP machines have been developed in the past few years, reliance on domain experts to fine-tune the corresponding process parameters so as to maintain productivity and quality cannot be neglected. This study exploits federated learning on the industrial internet of things (IIoT) paradigm to establish an intelligent decision support system across various networked machines. An IIoT-based squeegee blade is deployed in the SPP machines for better machine-to-machine communication and interconnectivity, while a global machine intelligence model is aggregated in a decentralised and privacy-preserving manner. Consequently, automated and sustainable manufacturing management for PCBA is achieved, and wastes from trial production runs are eliminated.

Authors:
Ho, G. T. S., Choy, S. K., Tong, P. H., & Tang, V.

Abstract:
Purpose
Demand forecast methodologies have been studied extensively to improve operations in e-commerce. However, every forecast inevitably contains errors, and this may result in a disproportionate impact on operations, particularly given the dynamic nature of fulfilling orders in e-commerce. This paper aims to quantify the impact that forecast error in order demand has on order picking, the most costly and complex operation in e-order fulfilment, in order to enhance the application of the demand forecast in an e-fulfilment centre.

Design/methodology/approach
The paper presents a Gaussian regression based mathematical method that translates the error of forecast accuracy in order demand into performance fluctuations in e-order fulfilment. In addition, the impact is examined under distinct order picking methodologies, namely order batching and wave picking.
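
A minimal sketch of the Gaussian-regression idea, using scikit-learn's Gaussian process regressor on synthetic data: fit the relationship between demand forecast error and picking-performance fluctuation, then read off the expected fluctuation (with uncertainty) at a given error level. The data, the variable names and the quadratic response are assumptions, not the paper's model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical history: demand forecast error (%) vs. picking-time overrun (%).
forecast_err = rng.uniform(-30, 30, size=(80, 1))
overrun = 0.02 * forecast_err.ravel() ** 2 + rng.normal(0, 1.5, 80)  # synthetic

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(forecast_err, overrun)

mean, std = gp.predict(np.array([[15.0]]), return_std=True)  # effect of a +15% error
print(f"expected overrun {mean[0]:.1f}% +/- {1.96 * std[0]:.1f}%")
```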

Findings
A structured model is developed to evaluate the impact of demand forecast error in order picking performance. The findings in terms of global results and local distribution have important implications for organizational decision-making in both long-term strategic planning and short-term daily workforce planning.

Originality/value
Earlier research examined demand forecasting methodologies in warehouse operations and order picking, and the impact of error in demand forecasting on order picking operations has been identified as a research gap. This paper contributes to closing this gap by presenting a mathematical model that quantifies the impact of demand forecast error on fluctuations in order picking performance.

Authors:
Mo, D. Y., Ma, C. Y., Ho, D. C., & Wang, Y.

Abstract:
Although reverse logistics of service parts enables the reuse of failed components to achieve greater environmental and economic benefits, research and successful business cases remain inadequate. This study designs a novel reverse logistics system that applies the Internet of Things (IoT) and business intelligence to streamline the reverse logistics process by identifying the appropriate components for sustainable component-reuse operations. Furthermore, an inventory classification scheme and an analytical model are developed to identify the failed components for refurbishment by considering the return quantity of the failed component, the repair rate of the failed component in the repair center, the reusable rate of refurbished parts, the corresponding costs, and the benefit of refurbished parts. Moreover, a mobile application powered by IoT technology is developed to streamline the process flow and avoid the collection of fake components. Lastly, a case study of an electronic product company is conducted, and it is concluded that the proposed approach enabled the company to facilitate the reuse of components and achieve cost savings. The results of this study demonstrate the importance of a reverse logistics system for companies to sustain after-market service operations.
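
The classification idea can be illustrated with a one-line expected-benefit criterion over the same factors the paper's analytical model weighs; the formula, the component names and the figures below are assumptions for the sketch, not the published model.

```python
def refurbish_benefit(returns, repair_rate, reusable_rate, repair_cost, new_cost):
    """Expected net benefit of refurbishing one component type: value of parts
    recovered for reuse minus the cost of repairing the repairable returns."""
    recovered = returns * repair_rate * reusable_rate
    return recovered * new_cost - returns * repair_rate * repair_cost

# Hypothetical component types: refurbish when the expected benefit is positive.
components = {
    'fan':   dict(returns=400, repair_rate=0.9, reusable_rate=0.8,
                  repair_cost=3.0, new_cost=12.0),
    'board': dict(returns=150, repair_rate=0.4, reusable_rate=0.5,
                  repair_cost=40.0, new_cost=55.0),
}
for name, p in components.items():
    b = refurbish_benefit(**p)
    print(name, 'refurbish' if b > 0 else 'scrap', round(b, 1))
```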

Authors:
Liu, H., Yu, L., Poon, C. K., Lin, Z., Leung, Y. W., & Chu, X.

Abstract:
In cognitive radio networks, rendezvous is a fundamental operation by which cognitive users establish communication links. Most existing works were devoted to shortening the time-to-rendezvous (TTR) but paid little attention to the qualities of the channels on which rendezvous is achieved. In fact, channel qualities, such as resistance to primary users’ activities, have a great effect on the rendezvous operation. If users achieve a rendezvous on a low-quality channel, the communication link is unstable and the communication performance is poor. In this case, re-rendezvous is required, which results in considerable communication overhead and a large latency. In this paper, we first show that the actual TTRs of existing rendezvous solutions increase by 65.40-104.38% if channel qualities are not perfect. Then we propose a Quality-Aware Rendezvous Framework (QARF) that can be applied to any existing rendezvous algorithm to achieve rendezvous on high-quality channels. The basic idea of QARF is to expand the set of available channels by selectively duplicating high-quality channels. We prove that QARF can reduce the expected TTR of any rendezvous algorithm when the expansion ratio λ is smaller than the threshold (−3+√(1+4(σ/μ)^2))/2, where μ and σ, respectively, are the mean and the standard deviation of the channel qualities. We further prove that QARF can always reduce the expected TTR of the Random algorithm by a factor of 1+(σ/μ)^2. Extensive experiments are conducted, and the results show that QARF can significantly reduce the TTRs of existing rendezvous algorithms by 10.50-51.05% when channel qualities are taken into account.
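
The channel-expansion idea at the heart of QARF is simple enough to sketch: append duplicates of the best channels so that any hopping sequence visits them more often. The quality values and the expansion rule details below are assumptions for illustration.

```python
import numpy as np

def expand_channels(channels, quality, lam):
    """QARF-style expansion: append duplicates of the highest-quality channels.

    lam is the expansion ratio (extra entries as a fraction of the original set);
    duplicated entries make good channels proportionally more likely to be
    visited by whatever channel-hopping rendezvous algorithm runs on top."""
    k = int(np.ceil(lam * len(channels)))
    best = np.argsort(quality)[::-1][:k]              # top-quality channel indices
    return list(channels) + [channels[i] for i in best]

quality = [0.9, 0.2, 0.6, 0.4]                        # hypothetical channel qualities
print(expand_channels(['c1', 'c2', 'c3', 'c4'], quality, lam=0.5))
# -> ['c1', 'c2', 'c3', 'c4', 'c1', 'c3']
```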

Authors:
Dong, N., Qin, M., Chang, J., Wu, C. H., Ip, W. H., & Yung, K. L.

Abstract:
Smart living is an emerging technology that has attracted much attention around the world. As a key technology of smart space, the principal part of smart living, the SLAM system has effectively expanded the ability of intelligent space robots to explore unknown environments. Loop closure detection is an important part of a SLAM system and plays a very important role in eliminating cumulative errors; a SLAM system without loop closure detection degrades to an odometer, and state estimation relying solely on an odometer deviates seriously in long-term, large-scale navigation and positioning. This paper proposes a metric learning method that uses deep neural networks for loop closure detection based on triplet loss. The map points obtained by metric learning are fused with all map points in the current keyframe, and the map points that do not meet the filtering conditions are eliminated. Based on the Batch Hard triplet loss, the weighted triplet loss function avoids suboptimal convergence in the learning process by applying weighted value constraints. At the same time, considering that fixed boundary parameters cannot adapt well to the diversity of scales between different samples, we use the semantic similarity of anchor samples and negative samples to redefine the boundary parameters. Finally, a SLAM system based on metric learning is constructed, and the TUM and KITTI SLAM datasets are used to evaluate the proposed model’s accuracy and recall rates. The scene features in this method are extracted automatically through neural networks instead of being set artificially, and a high-precision loop closure detection method based on the weight-adaptive triplet loss is effectively realised in the loop closure detection experiment. The minimum relative pose error is 0.00048 m, which is 15.8% less than that of the loop closure detection algorithm based on the bag-of-words model.
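
The adaptive-margin triplet loss can be sketched in a few lines of NumPy. The cosine-similarity-driven margin below mirrors the abstract's idea of redefining the boundary parameter from anchor-negative semantic similarity, but its exact form is an assumption, not the authors' definition.

```python
import numpy as np

def adaptive_triplet_loss(anchor, positive, negative, base_margin=0.3):
    """Triplet loss whose margin grows with anchor-negative similarity, so harder
    (more similar) negatives are pushed further away. Inputs: (n, d) embeddings."""
    d_pos = np.linalg.norm(anchor - positive, axis=1)
    d_neg = np.linalg.norm(anchor - negative, axis=1)
    cos = np.sum(anchor * negative, axis=1) / (
        np.linalg.norm(anchor, axis=1) * np.linalg.norm(negative, axis=1) + 1e-9)
    margin = base_margin * (1.0 + cos)     # similar negative -> larger margin
    return np.maximum(d_pos - d_neg + margin, 0.0).mean()
```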

Authors:
Mo, D. Y., Tang, Y. M., Wu, E. Y., & Tang, V.

Abstract:
Electronic assessment (e-assessment) is an essential part of higher education, used not only to manage the learning performance of large classes of students but particularly to assess students’ learning outcomes. The e-assessment data generated can be used not only to determine students’ study weaknesses and develop strategies for teaching and learning, but also to develop essential teaching and learning pedagogies for online teaching. Despite the wider adoption of Information and Communication Technology (ICT) due to the COVID-19 pandemic, universities still encountered numerous problems during the transformation to electronic teaching, as most educators struggled with the effective implementation of the Electronic Assessment System (EAS). The successful launch of an EAS relies heavily on students’ intention to use the new and unfamiliar electronic system, which was actually unknown to the project managers of the EAS. It is therefore important to understand students’ views and concerns regarding the EAS and the proactive measures taken by universities to enhance students’ acceptance and intention of usage. Although most studies investigate students’ acceptance of online learning, there is still little research on the adoption of e-assessment. In this regard, we propose a theoretical model based on students’ perceptions of the EAS. Based on the Technology Acceptance Model (TAM) and a major successor of TAM, an electronic assessment system acceptance model (EASA model) is developed with key measures including system adoption anxiety, e-assessment facilitation, and risk reduction. Data are obtained through a survey among current students at a local university, and structural equation modeling (SEM) is applied to analyze the quantitative data. This study has a significant impact on improving educators’ use of e-assessment in order to develop essential online teaching and learning pedagogy in the future.

Authors:
Mo, D. Y., Wang, Y., Ho, D. C. K., & Leung, K. H.

Abstract:
Service parts management has the potential to generate high profits for companies that deliver superior service parts services in the after-sale market. However, a big challenge in managing service parts operations is to meet the high expectations of service levels and to reduce excess inventories caused by fluctuating demand and a complex service parts logistics network structure. By expanding the conventional inventory management that passively focuses on the forward and lateral flows of service parts deployment, we propose a crucial but overlooked practice of inventory redeployment as an integral part of the operations that allow the proactive management of lateral and reverse flows of service parts. We formulate the service parts inventory problem with the application of an excess inventory redeployment strategy in a multi-echelon service network as a multi-period integer programming model. This optimisation model is evaluated using a case study of an international company’s service parts operations and demonstrates a higher cost-saving potential. Our novel, integrated approach confers the advantage of redeploying excess inventories in a closed-loop service parts logistics network with a higher cost-saving potential that could not have been achieved in a conventional approach.

Authors:
Tang, V., Lam, H. Y., Wu, C. H., & Ho, G. T. S.

Abstract:
Due to the increasing ageing population, how caregivers can effectively provide long-term care services that meet older adults’ needs with finite resources is an emerging issue. In addressing this issue, nursing homes are striving to adopt smart health with the internet of things and artificial intelligence to improve the efficiency and sustainability of healthcare. This study proposes a two-echelon responsive health analytic model (EHAM) to deliver appropriate healthcare services in nursing homes under the Internet of Medical Things environment. A novel care plan revision index is developed using a dual fuzzy logic approach for multidimensional health assessments, followed by care plan modification using case-based reasoning. The findings reveal that EHAM can generate high-quality, patient-centred long-term care solutions that maximise the satisfaction of nursing home residents and their families. Ultimately, sustainable healthcare services can be delivered within the communities.

Authors:
Tang, Y. M., Ho, G. T. S., Lau, Y. Y., & Tsui, S. Y.

Abstract:
In the context of the global economic slowdown, demand forecasting and inventory and production management have long been important topics for industry. With the support of smart warehouses, big data analytics, and optimization algorithms, enterprises can achieve economies of scale and balance supply and demand. Smart warehouse and manufacturing management is considered the culmination of recently advanced technologies, and it is important for enhancing the scalability and extendibility of the industry. Although many researchers have developed frameworks for smart warehouse and manufacturing management in various fields, most of these models focus mainly on product logistics and are not generalized to tackle the specific manufacturing problems facing the cyclical industry. Indeed, the cyclical industry has a key problem: its high sensitivity to the business cycle and to economic recessions, which are difficult to foresee, poses a large risk to the business. Although many inventory optimization approaches have been proposed to optimize warehouse inventory levels and facilitate production management, demand forecasting techniques seldom focus on the cyclical industry. On the other hand, management approaches are usually based on the complex logistics process instead of integrating stock inventory levels, which is crucial to composing smart warehouses and manufacturing. This research proposes a digital twin framework that integrates the smart warehouse and manufacturing with a roulette genetic algorithm for demand forecasting in the cyclical industry. We also demonstrate how this algorithm is practically implemented for forecasting demand, sustaining manufacturing optimization, and achieving inventory optimization. We adopted a small-scale textile company case study to demonstrate the proposed digital framework in the warehouse and to present the results of demand forecasting and inventory optimization. Various scenarios were conducted to simulate the results for the digital twin. The proposed digital twin framework and results help manufacturers and logistics companies to improve inventory management. This study has important theoretical and practical significance for the management of the cyclical industry.
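
The "roulette" in the roulette genetic algorithm refers to fitness-proportionate parent selection, sketched below; the population encoding and the fitness function for demand forecasting are not shown and would be problem-specific.

```python
import numpy as np

def roulette_select(population, fitness, n_parents, rng=np.random.default_rng(0)):
    """Roulette-wheel selection: each individual's chance of being drawn is
    proportional to its (shifted-positive) fitness."""
    f = np.asarray(fitness, dtype=float)
    f = f - f.min() + 1e-9                 # shift so every wheel slice is positive
    idx = rng.choice(len(population), size=n_parents, p=f / f.sum())
    return [population[i] for i in idx]

parents = roulette_select(['p1', 'p2', 'p3', 'p4'], [0.2, 0.9, 0.5, 0.1], n_parents=2)
print(parents)
```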

Authors:
Tsang, Y. P., Wu, C. H., Lin, K. Y., Tse, Y. K., Ho, G. T. S., & Lee, C. K. M.

Abstract:
New product development to enhance companies’ competitiveness and reputation is one of the leading activities in manufacturing. At present, achieving successful product design has become more difficult, even for companies with extensive capabilities in the market, because of disorganisation in the fuzzy front end (FFE) of the innovation process. Tremendous amounts of information, such as data on customers, manufacturing capability, and market trends, are considered in the FFE phase to avoid common flaws in product design. Because of the high degree of uncertainty in the FFE, multidimensional and high-volume data are added from time to time at the beginning of the formal product development process. To address the above concerns, deploying big data analytics to establish industrial intelligence is an active but still under-researched area. In this paper, an intelligent product design framework is proposed that incorporates fuzzy association rule mining (FARM) and a genetic algorithm (GA) into a recursive association-rule-based fuzzy inference system to bridge the gap between customer attributes and design parameters. Considering the current incidence of epidemics, such as the COVID-19 pandemic, communication of information in the FFE stage may be hindered. Through this study, a recursive learning scheme is therefore established to strengthen market performance, design performance, and sustainability in product design. It is found that industrial big data analytics in the FFE process achieves greater flexibility and a self-improvement mechanism in the evolution of product design.

Authors:
Dong, N., Zhai, M. D., Chang, J. F., & Wu, C. H.

Abstract:
As important immune cells in the human body, white blood cells play a very significant role in the auxiliary diagnosis of many major diseases. Clinically, changes in the number and morphology of white blood cells and their subtypes are prediction indexes for important, serious diseases such as anaemia, malaria, infections, and tumours. The application of image recognition technology and cloud computing to assist in medical diagnosis is a hot topic in current research, and we believe it has great potential to further improve real-time detection and medical diagnosis. This paper proposes a novel automatic classification framework for the recognition of five subtypes of white blood cells, in the hope of contributing to disease prediction. First, we present an adaptive threshold segmentation method to deal with blood smear images with nonuniform colour and uneven illumination; the method is designed based on colour space information and threshold segmentation. After successfully separating the white blood cell from the blood smear image, a large number of features, including geometrical, colour, and texture features, are extracted. However, redundant features can affect classification speed and efficiency, and in view of that, a feature selection algorithm based on classification and regression trees (CART) is designed to remove irrelevant and redundant features from the initial feature set. The selected prominent features are fed into a particle swarm optimisation support vector machine (PSO-SVM) classifier to recognise the types of white blood cells. Finally, to evaluate the performance of the proposed white blood cell classification methodology, we build a white blood cell data set containing 500 blood smear images for experiments. The proposed methodology achieves 99.76% classification accuracy, which well demonstrates its effectiveness.
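
A compact sketch of the selection-then-classification pipeline using scikit-learn: rank features with a CART tree's impurity importances, then tune an SVM on the retained subset. The coarse grid below stands in for the paper's PSO search over SVM parameters, and the feature budget is an assumption.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def cart_select_then_svm(X, y, keep=20):
    """Drop low-importance features with a CART tree, then classify with an SVM.

    The grid search is a stand-in for PSO tuning of (C, gamma);
    `keep` is an assumed feature budget."""
    cart = DecisionTreeClassifier(random_state=0).fit(X, y)
    top = np.argsort(cart.feature_importances_)[::-1][:keep]  # most informative features
    best = None
    for C in (0.1, 1, 10, 100):
        for gamma in ('scale', 0.01, 0.1):
            score = cross_val_score(SVC(C=C, gamma=gamma), X[:, top], y, cv=5).mean()
            if best is None or score > best[0]:
                best = (score, C, gamma, top)
    return best  # (cv accuracy, C, gamma, selected feature indices)
```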

Authors:
Lam, H. Y., Tsang, Y. P., Wu, C. H., & Tang, V.

Abstract:
For companies to gain competitive advantage, an effective customer relationship management (CRM) approach is necessary. Based on purchase behaviour and ordering patterns, customers can be classified into different categories so that customised sales and promotions can be provided to them. However, companies that lack an effective CRM strategy can only offer the same sales and marketing strategies to all customers. Furthermore, the traditional approach to managing customers relies on centralised control, in which information regarding customer segmentation is not shared across the customer network. Consequently, valuable customers may be neglected, resulting in the loss of customer loyalty and sales orders, and the weakening of trust in the customer–company relationship. This paper designs an integrated data analytic model (IDAM) in a peer-to-peer cloud, integrating an RFM-based k-means clustering algorithm, analytical hierarchy processing and fuzzy logic to divide customers into different segments and hence formulate a customised sales strategy. A pilot study of IDAM is conducted in a trading company specialising in advanced manufacturing technology to demonstrate how IDAM can be applied to formulate an effective sales strategy to attract customers. Overall, this study explores the effective deployment of CRM in the peer-to-peer cloud so as to facilitate sales strategy formulation and trust between customers and companies in the network.
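
The RFM-based k-means step can be sketched directly with pandas and scikit-learn; the column names and the four-cluster choice are assumed, and the paper's AHP and fuzzy-logic stages are not shown.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def rfm_segments(orders: pd.DataFrame, n_clusters=4):
    """Build Recency/Frequency/Monetary features per customer and cluster them.

    Expects columns customer_id, order_date (datetime), amount (assumed schema)."""
    now = orders['order_date'].max()
    rfm = orders.groupby('customer_id').agg(
        recency=('order_date', lambda d: (now - d.max()).days),
        frequency=('order_date', 'count'),
        monetary=('amount', 'sum'),
    )
    X = StandardScaler().fit_transform(rfm)    # put R, F, M on comparable scales
    rfm['segment'] = KMeans(n_clusters=n_clusters, n_init=10,
                            random_state=0).fit_predict(X)
    return rfm
```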

Authors:
Long, W., Wu, C. H., Tsang, Y. P., & Chen, Q.

Abstract:
Pallet pooling is regarded as a sustainable and cost-effective measure for the industry, but it is challenging to advocate due to weak data and pallet authentication. In order to establish trust between end-users and pallet pooling services, we propose an end-to-end, bidirectional authentication system for transmitted data and pallets based on blockchain and Internet-of-Things (IoT) technologies. In addition, secure data authentication fosters pallet authenticity in the whole supply chain network, which is achieved by considering tag, location, and object-specific features. To evaluate the object-specific features, the scale-invariant feature transform (SIFT) approach is adopted to match key-points and descriptors between two pallet images. According to the case study, the proposed system provides a low bandwidth blocking rate and a high probability of restoring complete data payloads. Consequently, positive influences on end-user satisfaction, quality of service, operational errors, and pallet traceability are achieved through the deployment of the proposed system.
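
The object-specific matching step can be sketched with OpenCV (version 4.4 and later ships SIFT in the main package): detect SIFT keypoints in two pallet images and count the matches that survive Lowe's ratio test. How the count is thresholded into an authentication decision is not specified in the abstract, so it is left out here.

```python
import cv2

def match_pallet_images(img_path_a, img_path_b, ratio=0.75):
    """Count SIFT keypoint matches between two pallet images using the ratio test.

    A high match count supports object-specific pallet authentication."""
    sift = cv2.SIFT_create()
    img_a = cv2.imread(img_path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(img_path_b, cv2.IMREAD_GRAYSCALE)
    _, des_a = sift.detectAndCompute(img_a, None)
    _, des_b = sift.detectAndCompute(img_b, None)
    matches = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
    good = [m for m, n in matches if m.distance < ratio * n.distance]
    return len(good)
```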

Authors:
Tsang, Y. P., Wu, C. H., Ip, W. H., & Shiau, W. L.

Abstract:
Purpose
Due to the rapid growth of blockchain technology in recent years, the fusion of blockchain and the Internet of Things (BIoT) has drawn considerable attention from researchers and industrial practitioners and is regarded as a future trend in technological development. Although several authors have conducted literature reviews on the topic, none have examined the development of the knowledge structure of BIoT, resulting in scattered research and development (R&D) efforts.

Design/methodology/approach
This study investigates the intellectual core of BIoT through a co-citation proximity analysis–based systematic review (CPASR) of the correlations between 44 highly influential articles out of 473 relevant research studies. Subsequently, we apply a series of statistical analyses, including exploratory factor analysis (EFA), hierarchical cluster analysis (HCA), k-means clustering (KMC) and multidimensional scaling (MDS) to establish the intellectual core.

Findings
Our findings indicate that there are nine categories in the intellectual core of BIoT: (1) data privacy and security for BIoT systems, (2) models and applications of BIoT, (3) system security theories for BIoT, (4) frameworks for BIoT deployment, (5) the fusion of BIoT with emerging methods and technologies, (6) applied security strategies for using blockchain with the IoT, (7) the design and development of industrial BIoT, (8) establishing trust through BIoT and (9) the BIoT ecosystem.

Originality/value
We use the CPASR method to examine the intellectual core of BIoT, which is an under-researched and topical area. The paper also provides a structural framework for investigating BIoT research that may be applicable to other knowledge domains.

Authors:
Wang, T., Zuo, H., Wu, C. H., & Hu, B.

Abstract:
Estimating the difference between the new export competitive advantages of China and those of the world’s trading powers has been a key measurement problem in China-related studies. In this work, a comprehensive evaluation index system for new export competitive advantages is developed, a soft-sensing model for China’s new export competitive advantages based on the fuzzy entropy weight analytic hierarchy process is established, and the soft-sensing values of key indexes are derived. The obtained evaluation values of the main measurement index are used as the input variable of the fuzzy least squares support vector machine, and a combined soft-sensing model of the key index parameters of China’s new export competitive advantages, based on the fuzzy least squares support vector machine, is established. The soft-sensing results of China’s new export competitive advantage index show that the developed soft measurement model is of high precision compared with other models; the technical and brand competitiveness indicators of export products contribute more significantly to China’s new export competitive advantages, while the service competitiveness indicator of export products contributes the least.

Authors:
Tsang, Y. P., Choy, K. L., Wu, C. H., & Ho, G. T. S.

Abstract:
Effective deployment of the emerging environmental sensor network in environmental mapping has become essential in numerous industrial applications. The essential factors for deployment include cost, coverage, connectivity, airflow of heating, ventilation, and air conditioning, system lifetime, and fault tolerance. In this letter, a three-stage deployment scheme is proposed to formulate the above considerations, and a fuzzy temperature window is established to adjust sensor activation times over various ambient temperatures. To optimize the deployment effectively, a multi-response Taguchi-guided k-means clustering method is proposed and embedded in the genetic algorithm, where an improved set of the initial population is formulated and system parameters are optimized. Therefore, the computational time for repeated deployment is shortened, while solution convergence is improved.

Authors:
Ho, G. T. S., Tsang, Y. P., Wu, C. H., Wong, W. H., & Choy, K. L.

Abstract:
In digital and green city initiatives, smart mobility is a key aspect of developing smart cities and is important for built-up areas worldwide. Double-parking and busy roadside activities, such as frequent loading and unloading of trucks, have a negative impact on traffic, especially in cities with high transportation density. Hence, a real-time internet of things (IoT)-based system for surveillance of roadside loading and unloading bays is needed. In this paper, a fully integrated solution is developed by equipping high-definition smart cameras with wireless communication for traffic surveillance; this system is referred to as the computer vision-based roadside occupation surveillance system (CVROSS). Through a vision-based network, real-time roadside traffic images, such as images of loading or unloading activities, are captured automatically. By making use of the collected data, decision support on roadside occupancy and vacancy can be evaluated by means of fuzzy logic and visualized for users, thus enhancing the transparency of roadside activities. The CVROSS was designed and tested in Hong Kong to validate the accuracy of parking-gap estimation and system performance, aiming at facilitating traffic and fleet management for smart mobility.

Authors:
Mo, D. Y., Ng, S. C. H., & Tai, D.

Abstract:
This study demonstrates how NetApp, a data storage system provider, used Six Sigma to solve the service parts inventory problem in its multiechelon logistics network, which its inventory management system was unable to fix. The nonstationary demand for service parts created a blind spot for the system, thus hampering NetApp’s contractual commitment to customers of an almost 100% fill rate (FR) for replacing service parts. Constant customer complaints because of FRs that were less than 100% caused NetApp to improve the performance of its service parts replenishment and order fulfillment processes. By following the Six Sigma approach and using the associated qualitative and quantitative tools, the company worked systemically to identify the major causes of insufficient stock and systematically corrected the problem. NetApp formulated a cost-effective inventory solution for its inventory planning system, which resulted in a 10% decrease in the ratio of inventory to revenue and an FR increase from 99.1% to 99.6%. The standard deviation of the replenishment lead time also declined from 4.97 to 1.87 days, implying that the variation of the replenishment lead time was greatly reduced. The Six Sigma process, therefore, provided new insights and a new approach to enable NetApp to manage its inventory planning process.

Authors:
Lo, W. H., Lam, B. S. Y., & Cheung, M. F.

Abstract:
This article examines the news framing of the 2017 Hong Kong Chief Executive election using a big data analysis approach. Analyses of intermedia framing of over 370,000 articles and comments are conducted including news published in over 30 Chinese press media, four prominent Chinese online press media, and posts published on three candidates’ Facebook pages within the election period. The study contributes to the literature by examining the rarely discussed role of intermedia news framing, especially the relationship between legacy print media, online alternative news media, and audience comments on candidates’ social network sites. The data analysis provides evidence that audiences’ comments on candidates’ Facebook pages influenced legacy news coverage and online alternative news coverage. However, this study suggests that legacy news media and comments on Facebook do not necessarily have a reciprocal relationship. The implication of the findings and limitations are discussed.

Authors:
Lam, B. S. Y., & Choy, S. K.

Abstract:
Different versions of principal component analysis (PCA) have been widely used to extract important information for image recognition and image clustering problems. However, owing to the presence of outliers, this remains challenging. This paper proposes a new PCA methodology based on a novel discovery that the widely used L1-PCA is equivalent to a two-group k-means clustering model. The projection vector of the L1-PCA is the vector difference between the two cluster centers estimated by the clustering model. In theory, this vector difference provides inter-cluster information, which is beneficial for distinguishing data objects from different classes. However, the performance of L1-PCA is not comparable with the state-of-the-art methods. This is because L1-PCA can be sensitive to outliers, as the equivalent clustering model is not robust to outliers. To overcome this limitation, we introduce a trimming function to the clustering model and propose a trimmed-clustering based L1-PCA (TC-PCA). With this trimming set formulation, the TC-PCA is not sensitive to outliers. Besides, we mathematically prove the convergence of the proposed algorithm. Experimental results on image classification and clustering indicate that our proposed method outperforms the current state-of-the-art methods.
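
The equivalence the paper builds on can be demonstrated in a few lines: run two-group k-means and take the difference of the cluster centres as the projection direction. The published TC-PCA additionally trims outlying points before the centres are estimated; that trimming step is omitted in this sketch.

```python
import numpy as np
from sklearn.cluster import KMeans

def projection_from_two_clusters(X):
    """Projection direction as the difference of two k-means cluster centres,
    mirroring the paper's L1-PCA/clustering equivalence (without trimming)."""
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    v = km.cluster_centers_[0] - km.cluster_centers_[1]
    return v / np.linalg.norm(v)               # unit-norm projection vector

# Two synthetic clusters in 5 dimensions; the direction separating them emerges.
X = np.vstack([np.random.default_rng(0).normal(0, 1, (50, 5)),
               np.random.default_rng(1).normal(3, 1, (50, 5))])
print(projection_from_two_clusters(X))
```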

Authors:
Xu, L., Tang, M. L., & Chen, Z.

Abstract:
In longitudinal data analysis, it is crucial to understand the dynamics of the covariance matrix of repeated measurements and to model it correctly in order to achieve efficient estimators of the mean regression parameters. It is well known that an incorrectly modeled covariance matrix can result in inefficient estimators of the mean regression parameters. In this article, we propose an empirical likelihood based method which combines the advantages of different dynamic covariance modeling approaches. The effectiveness of the proposed approach is demonstrated by an anesthesiology dataset and some simulation studies.

Ongoing Projects

Principal Investigator:
Dr. HO To Sum, George

Abstract:
Variable selection procedures aim to identify the correct covariates, which have a significant influence on the outcome variable and can provide robust model prediction. Traditional variable selection procedures, such as the forward selection procedure, backward elimination procedure, stepwise selection procedure, or model comparison via the Bayes factor or an information criterion such as the Akaike information criterion, may not be desirable for models with a large number of covariates or complex structures. In this project, we develop variable selection procedures for complex data modelling, in particular for the high-dimensional additive model with interactions under the marginality principle and for composite quantile regression for ordinal longitudinal data.

Quadratic regression models are natural extensions of linear models that include interaction effects (i.e., cross-product terms) between existing covariates. Interaction effects are important when the effect of one independent variable depends on the value of another independent variable. When the number of covariates is large and variable selection becomes necessary, it is usually recommended that the selected model follow the marginality principle, i.e., interaction terms can be selected into the model only if their parents (i.e., the associated main effects) are in the model. The additive model generalizes the linear model to a high-dimensional and nonlinear model that approximates the mapping from the covariates to the response using a sum of component functions of the individual covariates. No variable selection procedure has yet been developed for the additive model with interaction terms under the marginality principle.
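
The marginality constraint is easy to state in code: an interaction becomes a candidate only once both parents are in the model. The sketch below enforces it inside a plain forward-selection loop, with linear terms standing in for the project's additive component functions; scoring by cross-validated R² is an assumed choice.

```python
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def forward_select_marginality(X, y, max_terms=10):
    """Forward selection over main effects and pairwise interactions, admitting
    an interaction (i, j) only after both parents i and j are in the model."""
    p = X.shape[1]
    chosen, best_score = [], -np.inf
    while len(chosen) < max_terms:
        mains = set(t for t in chosen if isinstance(t, int))
        candidates = [j for j in range(p) if j not in chosen]
        candidates += [(i, j) for i, j in itertools.combinations(sorted(mains), 2)
                       if (i, j) not in chosen]          # marginality constraint
        step = None
        for c in candidates:
            cols = [X[:, t] if isinstance(t, int) else X[:, t[0]] * X[:, t[1]]
                    for t in chosen + [c]]
            s = cross_val_score(LinearRegression(), np.column_stack(cols), y, cv=5).mean()
            if step is None or s > step[0]:
                step = (s, c)
        if step is None or step[0] <= best_score:
            break                                        # no admissible term helps
        best_score = step[0]
        chosen.append(step[1])
    return chosen, best_score
```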

The traditional linear regression model examines the effect of a set of covariates on the mean of the response variable and has been widely adopted by researchers. Unfortunately, mean regression may lose efficiency when the error distribution is non-normal, and its least squares parameter estimates are notoriously sensitive to outliers. In contrast, quantile regression, which explores the underlying relationship between a particular (conditional) quantile of the response and the multidimensional covariates, yields more robust and efficient estimates. To alleviate the fluctuation of estimation efficiency associated with the chosen quantile, composite quantile regression has recently been introduced, and the corresponding estimates are more efficient. Variable selection in composite quantile regression for analyzing longitudinal ordinal responses is an attractive alternative for practitioners and has not yet been developed.

In this project, we develop variable selection procedures for (i) additive model with interaction terms under the marginality principle and (ii) composite quantile regression for analyzing longitudinal ordinal responses.

Principal Investigator:
Dr. HO To Sum, George

Abstract:
E-commerce has become an indispensable part of the global retail business, owing to the pandemic’s long-lasting impact on global consumer behaviours. In 2020, 17.8% of overall sales were e-commerce sales, and the share is predicted to rise to 24.5% by 2025, marking a 37.6% increase in just five years (Rita & Ramos, 2022). In China, national online retail sales reached 6,300.7 billion yuan during the first half of 2022, an increase of 3.1 percent year-on-year (National Bureau of Statistics of China, 2022). In the current highly competitive business environment, the logistics industry in a supply chain plays an ever more critical role in gaining competitive advantage (Andiyappillai, 2020). To grasp the opportunity of e-commerce, some traditional logistics companies have adopted digital transformation (DT), performing a transition from brick-and-mortar retail models to online ones to improve competitiveness and profitability. DT is the integration of digital technologies and new business models into all possible areas, enabling major business improvements such as value creation for customers and productivity enhancement (Albukhitan, 2020). In the logistics industry, applying digital technology provides significant benefits to operations management, such as high optimization potential through big data analytics, device- and location-independent information gathering through cloud computing, low management complexity through decentralization, and better automation through human-machine interaction (Kayikci, 2018). However, the typical widespread DT technologies cannot deal with the challenges of a rapidly changing business environment, such as the increasing complexity of operations management (Sanchis et al., 2019). Hence, the logistics industry requires an intelligent and flexible solution to enhance its resilience capacity and overcome the issues of a dynamically changing environment.

The low-code development platform (LCDP), a trendy mechanism facilitating the rapid development of software applications (apps), is well suited to support and facilitate resilient digital transformation. LCDPs are provided on the cloud and enable the development of fully functional apps via advanced graphical user interfaces and visual abstractions, all with minimal or no procedural code (Sahay et al., 2020). By using an LCDP, users with no particular programming background can build their own apps for conducting activities or tasks in business operations, without the help of several developers. Thus, developers can focus on valuable work such as improving the business logic of the application rather than dealing with unnecessary details related to setting up the needed infrastructure, managing data integrity across different environments, or enhancing the robustness of the system. Additionally, adoption of an LCDP would enhance flexibility and agility, speed up development time, enable quick response to market demands, reduce bug-fixing, lower deployment effort, and make maintenance easier (Al Alamin et al., 2021), and hence improve the resilience capacity of the logistics industry.

With an LCDP, traditional logistics companies can convert their data storage model from handwritten documents to digital records, which prevents significant data loss and creates opportunities for data analysis. To perform data analysis, artificial intelligence (AI) is a feasible technology that helps the industry make fast and smart decisions in reaction to external change, so as to enhance resilience capacity. AI can be defined as the engineering of intelligent machines, with a special focus on intelligent computer programs that can operate without human intervention (Woschank et al., 2020). By employing big data analytics, such as the fuzzy association rule mining (FARM) technique, AI can discover hidden relationships to make decisions and reach different conclusions based on the analysis of different situations. Consequently, the logistics industry can not only make real-time adjustments to business operations to ensure operational efficiency in a timely manner but also gain a stable mechanism for long-term quality enhancement.
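
A minimal FARM building block, under assumed triangular membership functions and the min t-norm: fuzzify two operational quantities and compute the fuzzy support and confidence of a candidate rule. The variables and figures are hypothetical.

```python
import numpy as np

def tri_membership(x, low, mid, high):
    """Triangular membership grade of x in a fuzzy set (e.g. 'order volume is HIGH')."""
    return np.clip(np.minimum((x - low) / (mid - low + 1e-9),
                              (high - x) / (high - mid + 1e-9)), 0.0, 1.0)

def fuzzy_support(antecedent, consequent):
    """Fuzzy support/confidence of rule A -> B from per-record membership grades,
    using min as the t-norm. A single FARM building block, not a full rule miner."""
    both = np.minimum(antecedent, consequent).mean()
    conf = both / (antecedent.mean() + 1e-9)
    return both, conf

volume = np.array([120, 340, 510, 80, 450])   # hypothetical daily order volumes
delay = np.array([5, 22, 31, 2, 28])          # hypothetical dispatch delays (min)
a = tri_membership(volume, 200, 400, 600)     # 'volume is HIGH'
b = tri_membership(delay, 10, 25, 40)         # 'delay is LONG'
print(fuzzy_support(a, b))
```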

Principal Investigator:
Dr. Valerie Tang

Abstract:
In this project, an AI-based intelligent model is proposed to facilitate warehouse operation and enhance its performance. Two modules, namely the digital workforce module (DWM) and the AI-based analysis module (AIAM), are involved in the proposed model. In the first module (DWM), robotic process automation (RPA) and a Manufacturing Execution System (MES) are integrated to automatically collect useful data in the manufacturing process. Then, the AIAM utilizes AI to take the data collected from the DWM as input and analyses the manufacturing process using fuzzy association rule mining (FARM) to predict the possible demand for materials and the possible outputs of finished goods. As a result, the proposed model would facilitate digital transformation in traditional warehouses, enabling relevant data to be digitalized for further analysis and visualization. Moreover, a digital workforce is adopted to reduce manual mistakes and increase operational efficiency. The AI-based intelligent model provides a better level of decision support for improving the quality of overall warehouse management. By doing so, traditional warehouses can accomplish error-free operation and efficiency enhancement under the e-commerce environment with their existing capacity.
Three objectives are defined in this research project:
• To study the existing warehouse operations and manufacturing process under the e-commerce environment.
• To integrate RPA with MES for digital transformation in warehouse operations and manufacturing processes so as to reduce human error and resource wastage.
• To design and develop the architecture of the AI-based intelligent model for warehouse operation analysis to achieve performance enhancement.

Principal Investigator:
Dr. Mo Yiu-Wing

Abstract:
Managing the dual channel of logistics resources has become more critical than ever, not only to achieve cost savings via enhanced process efficiency in operations, but also to utilise idle resources within and outside operations for social sustainability. With the success of crowdsourcing logistics platforms in recent years, many companies have sought to outsource some of their logistics orders to crowd networks. However, when compared with internal logistics resources, resources in the crowd network involve higher uncertainty, which creates many challenges for companies in determining the allocation of logistics orders to the crowdsourced platform for the fulfilment of ad-hoc demand. There is a lack of a holistic approach for integrating internal logistics resources among various storage facilities with crowdsourced vehicles via decision intelligence systems.

In this research project, we aim to design an integrated decision framework for managing the dual channel of logistics resources through the adoption of decision intelligence systems. With the support of decision intelligence systems, including systems simulation, data-driven models, and geospatial data analytics, the integration of internal and crowd logistics resources is expected to lead logistics operations to the next stage of operations management. The main contributions of this study therefore centre on the management theory of dual-channel logistics resource management. Beyond the theory, we will collaborate with a company in this project, and the collaborative case study will also serve as a guideline for practitioners.

Principal Investigator:
Dr. HO To Sum, George

Abstract:
Due to the outbreak of COVID-19, customer behaviour around the world looks completely different today than it did even one year ago. For example, retail sales via e-commerce channels in both the United States and the European Union recorded rapid growth (15% and 30% respectively) in 2020, while the gross value of retail sales was in decline (OECD, 2020). The same trend could be observed in Hong Kong: total retail sales recorded 11 consecutive months of decline in 2020 (Census and Statistics Department, HKSAR, 2020), while, by contrast, the value of individual customer purchases via e-commerce platforms doubled (Hong Kong Television Network Limited, 2020b). These changes in customer behaviour and the burgeoning of e-commerce purchasing indicate shrinking sales at physical stores and the emergence of the ‘next normal’: B2C e-commerce business. Amidst these changes, the value chain of the retail industry may be reconfigured, and Logistics Service Providers (LSPs) are urged to transform their routine operations (i.e., orders placed by wholesalers or retailers) into a sound e-fulfilment process (i.e., orders placed by individual customers via e-commerce) with effective strategies.

Recent research on the e-commerce industry has focused on improving operational effectiveness and efficiency, warehouse layout optimization, and last-mile delivery (Ranieri et al., 2018; Farooq et al., 2019). Considering the need for next-day or even same-day delivery in an e-fulfilment process, ensuring fast and efficient retrieval of Stock Keeping Units (SKUs) from shelves has become crucial for today’s LSPs. To meet the trends of the ‘next’ e-fulfilment ‘normal’, LSPs need to be transformed with additional capabilities for handling discrete and fluctuating e-order demands. However, most LSPs in Hong Kong, especially small and medium-sized (SME) LSPs, use rented warehouses to provide their services. They cannot afford the large investments entailed in adopting an automated storage and retrieval system and a sophisticated order picking system in rented warehouses, which limits their competencies and capabilities in handling e-orders. Therefore, research and development on effective e-fulfilment decision strategies for inventory replenishment and operational optimization is needed to enhance and streamline e-fulfilment operations.

This project aims to design and develop a Federated Learning-based e-fulfilment decision model for overcoming the new challenges presented to the logistics industry by today’s B2C e-commerce business in the wake of the COVID-19 pandemic. The system integrates collaborative machine learning and operational decision modelling to facilitate the transformation from traditional warehouses to e-fulfilment centres. From the perspective of LSPs, the proposed model allows them to generate the optimal pick face replenishment strategy and fully utilize resources for handling the fluctuating demand of e-orders without needing to reconstruct the whole premises and infrastructure. Considering the limited datasets held by SME LSPs, this project also contributes to establishing an industry-wide solution for estimating the quantity per SKU to be held in the pick face area. Through streamlined put-away and order picking in e-fulfilment operations, customer e-orders can be effectively fulfilled by logistics warehouses, enhancing the online shopping experience.
With the aid of the proposed decision model, LSPs gain the capabilities of the e-fulfilment process, resulting in better competitiveness and service coverage when the ‘next normal’ emerges in the B2C e-commerce market.
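
For readers unfamiliar with the collaborative machine learning ingredient, the sketch below shows federated averaging (FedAvg) in its simplest form: clients fit a shared linear model on private data, and only model weights are pooled. The data, model, and hyperparameters are synthetic placeholders, not the project's decision model.

```python
# A minimal federated averaging (FedAvg) sketch in NumPy, assuming each
# SME-type LSP holds a private table of SKU features and pick-face
# quantities and fits a shared linear model; data and dimensions are
# synthetic placeholders, not the project's datasets.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])

def local_update(w, X, y, lr=0.1, steps=20):
    """One client's local gradient steps on mean squared error."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Three clients (LSPs) with private, differently sized datasets
clients = []
for n in (30, 50, 80):
    X = rng.normal(size=(n, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    clients.append((X, y))

w_global = np.zeros(3)
for _ in range(10):                            # federated rounds
    locals_ = [local_update(w_global, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    w_global = np.average(locals_, axis=0, weights=sizes)  # FedAvg step

print("recovered weights:", np.round(w_global, 2))  # close to true_w
```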

Principal Investigator:
Dr. Cathy LAM

Abstract:
Due to the rapidly growing demand for reliable, high-quality cold chain logistics following the global pandemic, increasing concern has driven the development of a robust and comprehensive cold chain logistics system that meets designated handling requirements and specifications. Unlike general logistics services, time-temperature-sensitive products, such as pharmaceuticals and life sciences products, need to be refrigerated at extremely low temperatures during transportation and distributed within a short time period within the cold chain. Given the strict handling requirements of such time-temperature-sensitive products, appropriate cold chain packaging methods, monitoring devices and shipment routes must be specially designed by Cold Chain Logistics Service Providers. Currently, most passive packaging materials and monitoring devices are designed for one-time consumption so that the cost of reverse logistics in the supply chain network can be eliminated. However, after receiving the pharmaceuticals and life sciences products, the downstream supply chain partners simply dispose of the packaging materials, which results in poor sustainability in cold chain logistics. Consequently, a certain amount of solid waste is created each time goods are received, with a significant environmental impact on society. Hence, this research proposes a digital twin-based closed-loop logistics decision model for handling time-temperature-sensitive shipments. The results of this project will reshape the cold chain in the digital age, benefit society in terms of sustainability and environmental impact, and hence contribute to cold chain logistics development in Hong Kong.

Principal Investigator:
Dr David Chan

Abstract:
The Internet of Things (IoT) is an emerging wireless communications and networking technology that can be utilized to connect billions of devices and establish a close connection between our physical world and computer networks. Many time-critical applications, such as autonomous vehicles and industrial control, require the support of ultra-reliable low-latency communications (URLLC) to convey fresh information updates. However, information freshness cannot be accurately quantified by traditional metrics such as throughput and delay. Therefore, the age of information (AoI) metric has recently received extensive attention from researchers. AoI is defined as the elapsed time since the most recently received packet was generated. Literature shows that replacing traditional performance metrics with AoI may lead to fundamental changes in the communication system designs.
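
In the usual notation, the definition just given can be written compactly as follows, where U(t) is the generation time of the most recently received update (the notation is standard, not taken from the proposal):

```latex
% Age of information at time t, where U(t) is the generation time of the
% most recently received update; \bar{\Delta} is its long-run time average.
\Delta(t) = t - U(t), \qquad
\bar{\Delta} = \lim_{T \to \infty} \frac{1}{T} \int_{0}^{T} \Delta(t)\, dt
```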

Most AoI research has focused on the upper layers of communication networks. Lower-layer solutions, such as multiple access schemes for the medium access control (MAC) layer and multi-user interference cancellation schemes for the physical (PHY) layer, have not been thoroughly studied for their impact on information freshness. Existing lower-layer designs cannot guarantee good information freshness when a large number of users access complicated and unreliable wireless channels. This problem seriously hinders the development of time-critical IoT applications. Moreover, information update packets in the IoT networks are usually very short. Shannon’s channel capacity formula in information theory assumes an infinite blocklength and is therefore not suitable for characterizing the performance of short-packet communications.

The purpose of this project is to fill the above-mentioned research gaps. To begin with, we would like to develop a theoretical framework for AoI analyses in various error-prone short-packet wireless communication models. Based on the developed framework, we then design lower-layer algorithms to enhance information freshness via physical-layer network coding (PNC) and non-orthogonal multiple access (NOMA). PNC alleviates the multi-user interference problem by utilizing the network-coded packets decoded from superimposed signals. NOMA improves spectral efficiency by serving multiple users at the same time and frequency. Our preliminary simulations show that PNC and NOMA can significantly improve the AoI performance in many channel models. Finally, we will investigate the combination of PNC and NOMA to improve the AoI performance further. If this research achieves favorable outcomes, it will be a solid step in the theory and practice of enhancing information freshness in next-generation IoT networks.
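
As a toy companion to these ideas, the sketch below estimates average AoI over a memoryless erasure channel for two error rates; the rates are arbitrary stand-ins for channels with and without interference mitigation, not outputs of the project's PNC/NOMA models.

```python
# A minimal discrete-time AoI simulation, assuming a generate-at-will
# source and a memoryless erasure channel; the two erasure rates are
# arbitrary stand-ins for "with" and "without" interference mitigation
# (e.g., PNC/NOMA), not results from the project's models.
import random

def mean_aoi(p_err, slots=100_000, seed=1):
    random.seed(seed)
    age, total = 1, 0
    for _ in range(slots):
        total += age
        # A fresh update is sent every slot; on success the age resets to 1
        age = 1 if random.random() > p_err else age + 1
    return total / slots

print(f"AoI with heavy interference (p_err=0.5): {mean_aoi(0.5):.2f}")
print(f"AoI after mitigation        (p_err=0.1): {mean_aoi(0.1):.2f}")
```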

Principal Investigator:
Dr. Wu Chun-Ho Jack

Abstract:
Digital technologies not only automate financial processes but also restructure communication channels, develop new business models, and identify new markets. This research aims to investigate the affordances and actualisations of recommendation systems based on blockchain technology for insurance and financial products. To analyse blockchain-based recommendation systems, the research objectives are: to identify the critical moment at which health-conscious people purchase an insurance and/or financial product; to characterise the insurance and/or financial products to be recommended; and to extract association rules to enhance the recommendation process.

Principal Investigator:
Dr. Choy Siu-Kai

Abstract:
Image segmentation is a challenging problem in computer vision and has a wide variety of applications in various fields such as pattern recognition and medical imaging. One of the main approaches to this problem is to perform superpixel segmentation followed by a graph-based methodology to achieve image segmentation. Crucial to successful image segmentation with this method are the superpixel generation algorithm and the superpixel partitioning algorithm. Existing superpixel generation algorithms have various priorities, placing emphasis on boundary adherence, superpixel regularity, computational complexity, etc., but normally do not perform well in all of the above simultaneously. Superpixel partitioning algorithms are typically graph-based and can have high computational costs, which makes them inefficient in practice. In the proposed project, we will investigate a fast and effective unsupervised fuzzy superpixel-based image segmentation algorithm to remedy the aforementioned difficulties for a wide range of applications. In particular, we will study the combined use of a novel fuzzy clustering-based superpixel generation technique and a fuzzy graph-theoretic superpixel partitioning approach for image segmentation applications. The proposed segmentation method will be assessed by extensive comparative experiments using complex natural and textural images.
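
To fix ideas about the superpixel-then-partition pipeline, the sketch below generates crisp superpixels with scikit-image's SLIC and visualises them by mean colour; the fuzzy generation and fuzzy graph-theoretic partitioning proposed in this project are not available off the shelf, so this shows only the conventional baseline step.

```python
# A minimal (crisp) superpixel-generation sketch using scikit-image's SLIC,
# shown only to fix ideas; the project's fuzzy clustering-based generation
# and fuzzy graph-theoretic partitioning are not part of scikit-image.
from skimage import data, segmentation, color

img = data.astronaut()                                # sample RGB image
labels = segmentation.slic(img, n_segments=200, compactness=10, start_label=1)

# Color each superpixel by its mean color to visualise the decomposition
out = color.label2rgb(labels, img, kind='avg', bg_label=0)
print("superpixels:", labels.max(), "image shape:", out.shape)
```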

Completed Projects

Principal Investigator:
Dr. Ho To-Sum

Abstract:
The boom in e-commerce over the past decade has brought not only significant economic growth to e-retailers, but also new opportunities and challenges to the logistics industry. To seize the opportunities arising from emerging e-commerce logistics in Hong Kong, logistics service providers (LSPs) are forced to take on new roles and adjust their operations to fulfill dynamic customer demand. This research aims to develop a Blockchain-based E-Commerce Analytics Model, integrating blockchain technology and machine learning algorithms for managing data across supply chains and predicting dynamic e-commerce order demand.

This research enables industry practitioners, especially LSPs and e-retailers, to plan ahead for subsequent e-commerce operations. From the perspective of LSPs, the prediction model allows the firm to realize the e-commerce order arrival patterns, enabling flexible re-allocation of the right amount of resources in real time to deal with the hour-to-hour fluctuating arrival of orders in distribution centers. From the perspective of a retailer, the generic prediction model allows the firm to predict, for example, the sales volume among various e-commerce sales channels, the sales volume from different customer segments, and the e-commerce sales performance of different product categories. By tackling the unpredictability of demand in the e-commerce business environment, this research contributes to an effective decision support strategy for logistics operations planning, hence enhancing e-commerce logistics competence in Hong Kong.
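
As a minimal illustration of such a prediction model, the sketch below fits an autoregressive gradient-boosting regressor to a synthetic hourly order-arrival series; the series, lag depth, and model choice are assumptions for illustration, not the project's blockchain-integrated algorithm.

```python
# A minimal sketch of hour-ahead e-order arrival prediction with lag
# features, assuming a plain autoregressive gradient-boosting model;
# the synthetic arrival series stands in for real order data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
hours = np.arange(24 * 60)
# Synthetic hourly order counts with a daily cycle plus noise
orders = 50 + 30 * np.sin(2 * np.pi * hours / 24) + rng.poisson(5, hours.size)

lags = 24
X = np.column_stack([orders[i:-(lags - i)] for i in range(lags)])
y = orders[lags:]

model = GradientBoostingRegressor().fit(X[:-100], y[:-100])
pred = model.predict(X[-100:])
mae = np.mean(np.abs(pred - y[-100:]))
print(f"hold-out MAE over the last 100 hours: {mae:.1f} orders")
```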

Principal Investigator:
Dr. Ng Chi-Hung Stephen

Abstract:
The sales volume of e-commerce experienced rapid growth after the outbreak of COVID-19. As demand is unpredictable and most orders are small-sized, this increases the challenges for small and medium-sized (SME) companies in handling e-orders in terms of order management, data analysis, demand forecasting, and inventory optimization. Digital transformation could be the strategy companies adopt to transform their traditional warehouses under the e-commerce new normal. However, according to a survey, about half of SMEs in Hong Kong did not understand how to adopt digital technology and hesitated to pursue digital transformation, believing it could be complex and expensive. Therefore, this research proposes to design a new AI-based intelligent model that integrates digital technologies and artificial intelligence-based predictive analytics for companies to achieve performance enhancement. With the proposed model, many routine processes could be performed automatically, freeing human staff from repetitive tasks to focus on more innovative, value-added, and service-related jobs. Also, error-free operation and efficiency enhancement could be achieved in traditional warehouses, ultimately facilitating digital transformation.

Principal Investigator:
Dr. Mo Yiu-Wing

Abstract:
Given the current ageing population and limited social welfare expenditure, scholars are renewing their interest in how community organisations can operate sustainably to serve the various needs of people facing travel inconvenience. This research aims to design flexible vehicle management systems that enhance the management of various paratransit services through better system design and optimisation of vehicle resources.

The study scope of paratransit services includes scheduled route, dial-a-ride, feeder and pooled dial-a-ride services. Users who require these services have different expectations for travel times, prices, service frequencies, and pick-up and drop-off locations. This variety of service requirements poses numerous new challenges for community organisations seeking to sustain paratransit services. Hence, it is essential to develop options for a holistic approach to coordinating various types of service on a common sharing platform that meets people’s diverse needs more efficiently. We expect the outcomes of this research to support policy review and operational improvements for community organisations.

Principal Investigator:
Dr. WU Chun-ho

Abstract:
Most recent studies on logistics and supply chain management have focused on improvements to operational efficiency, information management, and networks. Pallet management is a crucial yet less-researched aspect of the logistics industry. Currently, the closed-loop network for pallet management is preferred in the logistics industry, as positive environmental and economic impacts can be obtained. However, such logistics networks are relatively complex and difficult to manage due to the presence of the reverse logistics process. Therefore, a blockchain-enabled IoT system for pallet-pooling management is proposed in this project. The system integrates blockchain and IoT technologies to identify, control, and monitor pallets in a closed-loop logistics network. Consequently, pallet standardisation can be established in the Guangdong-Hong Kong-Macao Greater Bay Area, while the efficiency of logistics operations can be further enhanced.

Principal Investigator:
Dr. HO To Sum, George

Abstract:
The next era of information and communication technology, namely digital transformation, is attracting considerable attention from industrial practitioners seeking to make unprecedented changes to their businesses. In Hong Kong, over 340,000 small and medium enterprises (SMEs) account for more than 98% of total business units and 45% of total employment. However, they have always faced challenges such as resource shortages, lack of talent, and poor performance measurement. In business process management, most tasks in SMEs are still labor-intensive, without effective technological support or automation.

To improve this situation, the project proposes the Smart Robotic Workforce System, a digital twin-based solution for business process re-engineering. Business processes in the physical world are transformed into the digital world using Internet of Things and physical internet technologies. To facilitate the deployment of robotic process automation (RPA) in SMEs, artificial intelligence techniques are utilized as an intelligent agent to learn from case scenarios and recommend appropriate RPA formulations. In addition, data mining and multi-criteria decision-making approaches are integrated to measure the performance and resource allocation of process robots in SMEs. Through automating business processes, a driving force for re-industrialization and a growth engine for the future economy can be achieved.
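
To illustrate the multi-criteria decision-making step, the sketch below ranks hypothetical process robots with TOPSIS, one common MCDM method; the criteria, weights, and scores are invented for the example, and the project does not necessarily use TOPSIS.

```python
# A minimal TOPSIS sketch for ranking process robots (RPA bots) on several
# performance criteria; the criteria, weights, and scores are hypothetical
# illustrations of the multi-criteria decision-making step, not project data.
import numpy as np

# Rows: candidate bots; columns: throughput, accuracy, cost (cost is a
# "lower is better" criterion, handled via the ideal/anti-ideal choice)
scores = np.array([[120, 0.98, 30.0],
                   [150, 0.95, 45.0],
                   [100, 0.99, 25.0]], dtype=float)
weights = np.array([0.4, 0.4, 0.2])
benefit = np.array([True, True, False])     # cost column: lower is better

norm = scores / np.linalg.norm(scores, axis=0)        # vector normalisation
v = norm * weights
ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
anti = np.where(benefit, v.min(axis=0), v.max(axis=0))

d_plus = np.linalg.norm(v - ideal, axis=1)
d_minus = np.linalg.norm(v - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)              # 1 = best

print("bot ranking (best first):", np.argsort(-closeness))
```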

Principal Investigator:
Dr. WU Chun-ho

Abstract:
There are different learning approaches for students to improve their English proficiency, such as providing online English materials and consultations at the University. However, most of these approaches lack a systematic way to monitor and assess students’ training and improvement paths. Teachers may rely only on assignments and examination results to evaluate student performance in a particular subject. In such situations, it is difficult for teachers to identify an individual student’s problems in speaking English and hence provide suggestions to prevent repetitive mistakes. Students may consequently continue to use “Engrish” when learning other subject domain knowledge, resulting in poor subject performance. Based on the above problems, this project aims to develop an interactive, artificial intelligence-assisted chatbot in the form of a mobile app for our students to learn and self-improve English proficiency effectively. Through assessment reports generated by the mobile app, teachers can effectively monitor students’ training and improvement progress in learning English.

Principal Investigator:
Dr. Lam Shu-Yan

Abstract:
Supervised learning problems infer functions from labelled training data. Learning lower dimensional subspaces in supervised learning problems is important in applications such as human action recognition, face recognition and object recognition. Dimensionality reduction is performed to remove noise from the data and simplify data analysis. Linear Discriminant Analysis (LDA) and its variants have been shown to be suitable for handling data structures in linear, quadratic and highly non-linear forms. However, conventional LDA formulations suffer from two major limitations. First, they use arithmetic means to represent the class centroids of the input data, yet the arithmetic mean has been shown not to represent such data effectively, especially data containing heavy noise and outliers. Second, it is difficult to show statistically that the learnt projection vectors are effective in the presence of heavy noise and outliers. Hence, conventional LDA fails to determine the most representative features from the input data.

In the proposed project, we aim to develop a new class of dimensionality reduction techniques for labelled data that can overcome the major limitations of conventional LDA techniques. The core idea is to formulate the dimensionality reduction problem as a set of clustering problems. The novelty of the proposed approach is that unsupervised clustering problems can effectively learn the subspace of the supervised learning problem. Locating effective centroids has been well-studied in clustering research. Furthermore, well-developed theories can be used to analyse the sensitivities of these methods in the presence of heavy noise and outliers. If successful, the proposed study will significantly increase the performance of dimensionality reduction for labelled data using clustering, which will fundamentally improve the way in which useful information can be extracted in many real-world applications.
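
The toy sketch below illustrates the motivating limitation: a single outlier shifts a mean-based class centroid, and hence a Fisher-style projection, more than a median-based one. It illustrates the problem only; it is not the proposed clustering-based method.

```python
# A minimal sketch contrasting mean- and median-based class centroids in a
# Fisher-style projection, to illustrate the outlier sensitivity that
# motivates the project; a toy illustration, not the proposed approach.
import numpy as np

rng = np.random.default_rng(7)
A = rng.normal([0, 0], 0.5, size=(50, 2))
B = rng.normal([3, 3], 0.5, size=(50, 2))
A_out = np.vstack([A, [[40.0, -40.0]]])      # one heavy outlier in class A

def fisher_direction(c0, c1, X0, X1):
    """Direction maximising between- over within-class scatter."""
    Sw = (X0 - c0).T @ (X0 - c0) + (X1 - c1).T @ (X1 - c1)
    w = np.linalg.solve(Sw, c1 - c0)
    return w / np.linalg.norm(w)

w_mean = fisher_direction(A_out.mean(0), B.mean(0), A_out, B)
w_med = fisher_direction(np.median(A_out, 0), np.median(B, 0), A_out, B)
print("mean-centroid direction:  ", np.round(w_mean, 2))
print("median-centroid direction:", np.round(w_med, 2))
```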

Principal Investigator:
Dr. Choy Siu-Kai

Abstract:
Image segmentation is a critical problem in computer vision for a wide variety of applications. Among the existing approaches, partial differential equations and variational methods have been extensively studied in the literature. Although most variational approaches use boundary and region information to segment natural and textural images with remarkable success, we note that most of the existing methods only consider simple information/features extracted from a particular image domain (e.g., grey level features in the spatial domain) to characterise image regions. However, such information/features are not informative enough to segment complex images. In the proposed project, we will investigate a robust and effective variational segmentation algorithm to remedy the aforementioned difficulties for a wide range of applications. In particular, we will study a mathematical optimisation framework that integrates the bit-plane-dependence probability models, which are used to characterise local region information extracted from various image domains, with the fuzzy region competition for image segmentation. We will also study the mathematical theory for the segmentation algorithm. The proposed segmentation method will be assessed by extensive and comparative experiments using complex natural and textural images.
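
For orientation, a generic two-phase fuzzy region competition energy has the following form, where the region terms e_i would, in this project, be derived from the bit-plane-dependence probability models; the notation below is ours, not taken from the proposal:

```latex
% Generic two-phase fuzzy region competition energy: u(x) in [0,1] is a
% fuzzy membership, and e_i(x) = -log p_i(f(x)) are region terms that
% would here come from bit-plane-dependence probability models.
E(u) = \int_{\Omega} |\nabla u|\, dx
     + \lambda \int_{\Omega} \big[\, u(x)\, e_1(x) + (1 - u(x))\, e_2(x) \,\big]\, dx
```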

Principal Investigator:
Prof. Tang Man-Lai

Abstract:
High dimensional data analysis has become increasingly frequent and important in diverse fields, for example, genomics, health sciences, economics and machine learning. Model selection plays a pivotal role in contemporary scientific discoveries. There has been a large body of work on model selection for complete data. However, complete data are often not available for every subject for many reasons, including the unavailability of covariate measurements and loss of data. The literature on model selection for high dimensional data in the presence of missing or incomplete values is relatively sparse. Therefore, efficient methods and algorithms for model selection with incomplete data are of great research interest and practical demand.

For model selection, information criteria (e.g., the Akaike information criterion and the Bayesian information criterion) are commonly applied, and they can easily be incorporated into the famous EM algorithm in the presence of missing values. A generalized EM algorithm has also been developed to update the model and its parameters in each iteration. It performs the Expectation step and the Model Selection step alternately, converges globally, and yields a consistent model in model selection. However, it may not always be numerically feasible to perform the Model Selection step, especially for high dimensional data. Therefore, a new method for model selection with high dimensional incomplete data is highly desirable. Our proposed algorithm in this project will hopefully yield a consistent model under general missing data patterns and converge numerically. Moreover, our proposed method is expected to perform variable selection efficiently in linear regression and generalized linear models, as well as model selection for graphical models.

Due to the convenience of its implementation using standard software modules, multiple imputation is arguably the most widely used approach for handling missing data. It is straightforward to apply an existing model selection method to each imputed dataset. However, it is challenging to combine model selection results across imputed datasets in a principled framework. To overcome this challenge, many advanced techniques have been developed for the variable selection problem, such as applying the group lasso penalty to the merged datasets of all imputations, stability selection within bootstrap imputation, and random lasso combined with multiple imputation. These techniques are feasible for high-dimensional data with complex missing patterns and have achieved good performance in simulation studies and real data analyses. However, as far as we know, there is surprisingly no imputation method for graphical models, so an imputation-based method for graphical model selection is highly desirable. In this project, we investigate bootstrap multiple imputation with stability selection. We expect the proposed method to handle general missing data patterns.
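
A stripped-down version of the bootstrap-imputation-with-stability-selection idea might look like the sketch below, which uses simple mean imputation and a lasso within each bootstrap replicate; the imputation model and threshold are simplifying assumptions, not the method under investigation.

```python
# A minimal sketch of bootstrap multiple imputation with stability
# selection for variable selection, assuming mean imputation and a lasso
# per bootstrap replicate; a simplified stand-in for the proposed method.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(3)
n, p = 200, 20
X = rng.normal(size=(n, p))
y = X[:, 0] * 2 - X[:, 1] * 1.5 + rng.normal(scale=0.5, size=n)
X[rng.random(X.shape) < 0.1] = np.nan          # 10% values missing at random

B, hits = 50, np.zeros(p)
for _ in range(B):
    idx = rng.integers(0, n, n)                # bootstrap resample
    Xb, yb = X[idx].copy(), y[idx]
    col_means = np.nanmean(Xb, axis=0)
    Xb = np.where(np.isnan(Xb), col_means, Xb) # impute within the replicate
    coef = LassoCV(cv=5).fit(Xb, yb).coef_
    hits += coef != 0

selected = np.where(hits / B >= 0.8)[0]        # stability threshold
print("selected variables:", selected)         # expect {0, 1}
```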

Principal Investigator:
Dr. Wong Siu-kuen Ricky

Abstract:
Past studies on negotiation strategy have emphasised the benefits of different compliance techniques, for example, door-in-the-face, foot-in-the-door, low-balling, and the anchoring effect. A growing body of research has shown how negotiators using compliance tactics may obtain better negotiated outcomes. Undoubtedly, the use of these tactics is beneficial when only a one-off negotiation is involved. Many opportunities for negotiation training are now available at universities and in corporate training courses, and, in real-life settings, negotiators are often involved in repeated negotiations. Coupled with people’s knowledge of negotiation tactics, it is contentious whether the use of compliance tactics is beneficial in the longer run. The adverse effects of compliance tactics have been neglected in research on negotiation. A more thorough understanding of the potential costs of using compliance tactics is important for negotiators and practitioners to make informed decisions.

Principal Investigator:
Dr. Mo Yiu-Wing

Abstract:
With the advanced logistics developments of recent decades, various manufacturers are able to profit from spare parts services for system maintenance and to enhance product sustainability by managing express delivery and reverse logistics. These developments have driven the evolution of traditional spare parts management into a new service model. Apart from on-site spare parts management, manufacturers and authorised service providers must offer more customised services and collect repairable items from users in the reverse logistics process. However, these evolving service requirements introduce procedural complexities and extend the service scope.

In this research, we aim to optimise the process of service parts management through a holistic and adaptive approach. The process scope includes logistics network design, inventory and warehouse management, and reverse logistics operations. To identify the numerous factors and parameters involved in process optimisation, we will start by standardising a generic process flow of service parts operations that aligns with companies’ strategic objectives. Then, we will collect data to investigate the effects of these factors and their correlations. After identifying the critical factors, we will formulate them into a generic decision model for deriving optimal adaptive policies with a data-driven process control mechanism. A simulation platform will be developed to verify and monitor the proposed solutions. The performance of the optimal adaptive policies will finally be benchmarked against the optimal static policy, which is commonly applied in various industries. These results will provide effective guidelines for implementing adaptive process optimisation of service parts operations.
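
As a toy preview of the static-versus-adaptive benchmark, the sketch below simulates an order-up-to inventory policy against a demand stream whose rate shifts midway; all rates, costs, and policy rules are invented for illustration, not project findings.

```python
# A minimal service-parts simulation comparing a static base-stock policy
# with a simple adaptive one that re-estimates demand from recent data;
# demand rates and costs are arbitrary placeholders, not project results.
import numpy as np

rng = np.random.default_rng(11)
T = 500
demand = rng.poisson(lam=np.where(np.arange(T) < T // 2, 4, 8))  # rate shift

def simulate(base_stock_fn, holding=1.0, backlog=5.0):
    stock, cost, history = 10, 0.0, []
    for d in demand:
        history.append(d)
        stock += max(base_stock_fn(history) - stock, 0)  # order up to level
        stock -= d
        cost += holding * max(stock, 0) + backlog * max(-stock, 0)
    return cost / T

static = simulate(lambda h: 6)                              # fixed level
adaptive = simulate(lambda h: int(np.mean(h[-30:]) * 1.5))  # tracks demand
print(f"avg cost/period  static: {static:.1f}  adaptive: {adaptive:.1f}")
```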

Principal Investigator:
Dr. Liu Hai

Abstract:
Robotics technologies are advancing rapidly. Groups of robots have been developed that can communicate with each other using wireless transmissions and form robot swarms. Applications of these swarms include surveillance, search and rescue, mining, agricultural foraging, autonomous military units and distributed sensing in micromachinery or human bodies. For example, swarm robots can be sent into places that are too dangerous for human workers and detect life signs via infrared sensors. In all of these applications, a group of self-propelled robots moves in a cohesive way (i.e., connectivity is preserved during movement); such behavior is usually referred to as collective motion. This research aims to design self-adaptive collective motion algorithms for swarm robots in 3D space. The algorithms are expected to be self-adaptive in the sense that robots will be able to dynamically determine proper moving parameters based on their environments and statuses. Using the proposed collective motion algorithms, robots will be able to move along a pre-planned path from a source to a destination while satisfying the following requirements: 1) the robots will use only one-hop neighbor information; 2) the robots will maintain connectivity of the network topology for information exchange; 3) the robots will maintain a desired neighboring distance; and 4) the robots will be capable of bypassing obstacles without partitioning the robot swarm (i.e., member loss). We will develop collective motion algorithms for three cases: 1) no obstacles and no leader; 2) no obstacles with a leader; and 3) with obstacles (with and without a leader). We will conduct extensive experiments on testbed robots to examine the performance of the algorithms in practical applications.
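
A minimal flavour of such one-hop collective motion is sketched below: each robot steers toward a goal while a spring-like term holds the desired distance to sensed neighbours. The gains, ranges, and absence of obstacles are simplifying assumptions, not the algorithms to be developed.

```python
# A minimal sketch of one-hop, attraction-repulsion collective motion in
# 3D: each robot steers toward the goal while holding a desired distance
# to neighbours it can sense; gains and ranges are illustrative only.
import numpy as np

rng = np.random.default_rng(5)
pos = rng.normal(scale=2.0, size=(12, 3))   # 12 robots in 3D space
goal = np.array([20.0, 0.0, 0.0])
d_star, sense_range, dt = 2.0, 5.0, 0.1

for step in range(500):
    vel = 0.5 * (goal - pos) / np.linalg.norm(goal - pos, axis=1, keepdims=True)
    for i in range(len(pos)):
        diff = pos - pos[i]                       # vectors to all robots
        dist = np.linalg.norm(diff, axis=1)
        nbr = (dist > 0) & (dist < sense_range)   # one-hop neighbours only
        if nbr.any():
            # spring-like term: attract beyond d_star, repel inside it
            gain = (dist[nbr] - d_star) / dist[nbr]
            vel[i] += 0.8 * (gain[:, None] * diff[nbr]).mean(axis=0)
    pos += dt * vel

spread = np.linalg.norm(pos - pos.mean(axis=0), axis=1).max()
print(f"centroid: {np.round(pos.mean(axis=0), 1)}, max spread: {spread:.1f}")
```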

Principal Investigator:
Prof. Tang Man-Lai

Abstract:
One of the most important challenges in modern survey measurement is the elicitation of truthful answers to sensitive questions about behavior and attitudes (e.g., abortion, illegal drug use and racial prejudice). It has long been well known that accessing information about a sensitive characteristic in a population usually induces two notorious issues, namely non-response bias (i.e., respondents refuse to collaborate for fear that their confidentiality will not be protected) and response bias (i.e., respondents answer the sensitive questions but give false answers), which usually lead to loss of estimation efficiency, inflated sampling variance, and biased estimates. Therefore, techniques that guarantee anonymity, minimize respondents’ feelings of jeopardy, and encourage honest answers are in great demand. In this project, we propose several practical generalizations of the famous item count techniques for sensitive survey questions.

Poisson ICT has recently been developed to overcome the shortcomings of the conventional item count techniques (ICTs) by replacing the list of independent innocuous (binary) statements with a single innocuous (Poisson) statement. Despite its various attractive advantages, Poisson ICT still has some limitations. First, it is assumed that respondents will comply with the survey design. Second, it is assumed that the outcome of the innocuous statement follows the less practical Poisson distribution. Third, no regression model has been developed for binary sensitive outcomes.
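
The basic Poisson ICT estimator, under the full-compliance assumption that this project relaxes, can be simulated in a few lines: the treatment group reports an innocuous Poisson count plus the sensitive Bernoulli answer, the control group reports the count alone, and the difference in means estimates the prevalence.

```python
# A minimal simulation of the Poisson item count technique: treatment
# respondents report Z + X (innocuous Poisson count plus the sensitive
# Bernoulli answer), control respondents report Z alone, and the prevalence
# estimate is the difference in means; full compliance is assumed, which
# is exactly the limitation the project relaxes.
import numpy as np

rng = np.random.default_rng(2024)
n, lam, pi_true = 5000, 3.0, 0.15        # sample size, Poisson rate, prevalence

z_treat = rng.poisson(lam, n)
x = rng.random(n) < pi_true              # latent sensitive attribute
reported_treat = z_treat + x             # respondents only report the sum

reported_ctrl = rng.poisson(lam, n)      # control group: innocuous count only

pi_hat = reported_treat.mean() - reported_ctrl.mean()
print(f"estimated prevalence: {pi_hat:.3f} (true {pi_true})")
```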

In this proposal, we plan to

(i) (New Poisson ICT with Non-Compliance) Develop a new Poisson ICT that takes respondents’ non-compliance into consideration;

(ii) (New Inflated-Zero Poisson ICT) Develop a new Poisson-type ICT that allows the outcome of the innocuous statement to follow the more realistic inflated-zero Poisson distribution; and

(iii) (Regression Modeling with Sensitive Outcome) Develop a regression model for binary sensitive outcomes.

Upcoming Projects