Ahmad Rahdari, Elham Keshavarz, Ehsan Nowroozi et al.
The increasing need to process large, high-dimensional datasets, and the substantial computational power this requires, have made distributed cloud servers essential. These servers provide cost-effective solutions that make storage and computing accessible to ordinary users. However, they face significant vulnerabilities, including data leakage, metadata spoofing, insecure programming interfaces, malicious insiders, and denial of service. To gain public trust in distributed computing, it is crucial to address privacy and security concerns while maintaining high performance and efficiency. Multiparty computation, differential privacy, trusted execution environments, and federated learning are the four major approaches developed to address these issues. This survey reviews and compares the four approaches within a structured framework, highlighting recent top-tier research published in prestigious journals and conferences. Particular attention is given to progress in federated learning, which trains a model across multiple devices without sharing the actual data, keeping the data private and secure. The survey also highlights federated learning techniques as new paradigms for building responsible computing systems, including secure federated learning through the detection of malicious updates, and privacy-preserving federated learning via data encryption, data perturbation, and anonymization. Finally, the survey discusses future research directions for connecting academic innovations with real-world industrial applications.
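As a concrete illustration of the federated learning paradigm the survey emphasizes, the following is a minimal sketch of federated averaging, in which only model weights, never raw data, leave the clients. The function and variable names are illustrative assumptions, not taken from any surveyed paper.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate client model weights by a dataset-size-weighted average.
    Each client trains locally on private data; only weights are shared."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Three clients hold private data of different sizes.
w1, w2, w3 = np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])
global_w = fedavg([w1, w2, w3], client_sizes=[10, 10, 20])
```

The larger third client contributes half the aggregate, reflecting its share of the total data.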
Telecommunication, Transportation and communications
Introduction. Objectively predicting speech intelligibility is important in both telecommunication and human-machine interaction systems. Classic methods rely on signal-to-noise ratios (SNR) to predict speech intelligibility successfully. One exception is clear speech, in which a talker intentionally articulates as if speaking to someone who has hearing loss or comes from a different language background. As a result, at the same SNR, clear speech produces higher intelligibility than conversational speech. Despite numerous efforts, no objective metric can successfully predict the clear-speech benefit at the sentence level.
Methods. We proposed a Syllable-Rate-Adjusted-Modulation (SRAM) index to predict the intelligibility of clear and conversational speech. SRAM uses speech segments as short as 1 s and estimates their modulation power above the syllable rate. We compared SRAM with three reference metrics, the envelope-regression-based speech transmission index (ER-STI), the hearing-aid speech perception index version 2 (HASPI-v2) and the short-time objective intelligibility (STOI), and with five automatic speech recognition systems: Amazon Transcribe, Microsoft Azure Speech-To-Text, Google Speech-To-Text, wav2vec2 and Whisper.
Results. SRAM outperformed the three reference metrics (ER-STI, HASPI-v2 and STOI) and the five automatic speech recognition systems. Additionally, we demonstrated the important role of the syllable rate in predicting speech intelligibility by comparing SRAM with the total modulation power (TMP), which is not adjusted by the syllable rate.
Discussion. SRAM can potentially help to understand the characteristics of clear speech, screen speech materials for high intelligibility, and convert conversational speech into clear speech.
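The core idea, measuring envelope modulation power above the syllable rate, can be sketched as follows. This is a simplified illustration under assumed choices (rectification as the envelope estimate, a normalized power ratio), not the authors' exact SRAM formulation.

```python
import numpy as np

def modulation_power_above_rate(signal, fs, syllable_rate_hz):
    """Fraction of envelope modulation power above the syllable rate.
    Rectifies the signal to approximate its amplitude envelope, removes
    the DC component, and integrates the modulation spectrum above
    syllable_rate_hz relative to the total modulation power."""
    envelope = np.abs(signal)                 # crude amplitude envelope
    envelope = envelope - envelope.mean()     # remove DC before the FFT
    spectrum = np.abs(np.fft.rfft(envelope)) ** 2
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return spectrum[freqs > syllable_rate_hz].sum() / spectrum.sum()
```

A signal whose envelope fluctuates at 2 Hz yields nearly all of its modulation power below a 5 Hz cutoff and nearly none above it.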
Objectives. The service level agreement is an important tool for building reasonable relations between subscribers and operators of telecommunication networks, covering, among other things, the quality of the services provided. One key component is reliability, as assessed by the availability factor. The most suitable model for assessing the reliability of a service is a random graph model based on the service contour, i.e., the set of technical resources involved in providing that service. In this formulation, the reliability of the service is assessed from the reliability of the elements constituting the telecommunications network (graph): nodes (vertices) and communication lines (edges). The availability factors of the nodes and lines are in turn determined by the design features of the distribution environment and by the technical means used to organize them. The purpose of this work is to develop an approach to analyzing the reliability of telecommunication networks that support protective switching mechanisms for one protected section and one backup section.
Methods. The following methods are used: the theory of random graphs, matrices, probability theory and computer modeling.
Results. The elements of a route, both basic and reserving, are divided into three groups. The first contains the permanent, unchangeable parts of the paths, the second identifies the reserved sections, and the third the reserving sections. Each reserved and reserving section is formed on the basis of specified preferences, usually aimed at increasing the resulting reliability, although other rules may be used. For protective switching schemes with one protected section and one backup section, a variant of forming the routes used for further calculation of the reliability indicator is shown.
Conclusions. Using the example of a backbone network, the study shows that protective switching mechanisms for the case of one required transmission route yield a significant increase in reliability, with the exception of protective switching applied to individual sections. This is primarily due to the topology of the network under consideration.
Abstract In radar target detection and tracking tasks, the detection algorithm and the data association algorithm are the primary technologies. Detection accuracy, tracking stability and processing speed are the keys to effective high-resolution range profile (HRRP) multi-target tracking (MTT). Classic HRRP target tracking algorithms perform detection by extracting handcrafted features and perform tracking with a data association algorithm. With the development of neural network methods, deep learning has been widely applied to target detection and tracking. An HRRP detection and tracking (HDT) network based on neural networks is proposed, which comprises three steps. Firstly, the HRRP signal is processed by a convolutional neural network detector to extract features from the original signal and generate the detection measurement. Secondly, a predicted measurement is generated by a Kalman filter, which estimates the current state of the target from its previous states and the detection measurement. Finally, a ReID network calculates the cosine similarity between tracks and measurements, and a linear sum assignment operation matches the tracks to the measurements. With the proposed algorithm, the HDT network is capable of detecting and tracking multiple targets in complex, cluttered environments with high accuracy. Experimental results on an HRRP dataset collected by a moving radar platform show the outstanding performance of the HDT network in detection accuracy, track integrity and real-time MTT.
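The association step described, cosine similarity between track and measurement embeddings followed by linear sum assignment, can be sketched as follows. The embedding contents are illustrative; the assignment solver is SciPy's standard routine, assumed here in place of whatever implementation the authors used.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(track_embeddings, meas_embeddings):
    """Match tracks to measurements: build a cost matrix of
    (1 - cosine similarity) and solve the linear sum assignment."""
    t = track_embeddings / np.linalg.norm(track_embeddings, axis=1, keepdims=True)
    m = meas_embeddings / np.linalg.norm(meas_embeddings, axis=1, keepdims=True)
    cost = 1.0 - t @ m.T              # low cost = similar embeddings
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols))
```

Each returned pair (track index, measurement index) is a matched track-measurement association with globally minimal total cost.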
This milestone work overcomes the bandwidth limitation of the erbium-doped fiber ring laser (EDFRL) by inserting a Moiré grating with closely spaced wavelengths into the laser loop, so that the beating of the wavelengths produces very-high-bandwidth pulsed chaos. The variation of the chaos bandwidth is studied with respect to three parameters: wavelength spacing, fiber Bragg grating bandwidth and fiber nonlinearity. High-bandwidth chaos with rich and flatter spectral content can be produced by controlling these parameters within the narrow ranges studied in this work. The chaotic pulses become narrower in width, more closely spaced in time and more dynamic in amplitude, and the Lyapunov exponent increases as the chaos becomes more unpredictable. The EDFRL, which offers more control parameters, can be deployed in higher-bandwidth chaos generation applications alongside semiconductor lasers.
The birth of the satellite Internet brings new development opportunities, but also many challenges. This survey investigates how artificial intelligence, as an important auxiliary tool, has been widely used in satellite communication and the satellite Internet in the context of space-air-ground integration, covering communication anti-jamming, communication routing, satellite-terrestrial network architectures, constellation operation and management, and other scenarios. The AI algorithms involved include traditional machine learning, deep learning, reinforcement learning and others. Finally, taking the development trends of AI in the satellite field into consideration, several future research directions are put forward, providing new ideas and technical solutions for the intelligent development of the field.
The power system is in transition towards a more intelligent, flexible and interactive system with higher penetration of renewable energy generation; load forecasting, especially short-term load forecasting for individual electricity customers, therefore plays an increasingly essential role in future grid planning and operation. A big-data framework for short-term power load forecasting from heterogeneous data was proposed. It collects data from smart meters and weather forecasts, pre-processes it, and loads it into a NoSQL database capable of storing and further processing large volumes of heterogeneous data. A long short-term memory (LSTM) recurrent neural network was then designed and implemented to determine load profiles and forecast the electricity consumption of a residential community for the next 24 hours. The proposed framework was tested on a publicly available smart-meter dataset of a residential community; the LSTM's performance was compared with two benchmark algorithms in terms of root mean square error and mean absolute percentage error, and its validity was verified.
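The data preparation and the two evaluation metrics named above can be sketched as follows. The history and horizon lengths are illustrative assumptions (24-hour forecasts are stated; the 48-reading history window is not from the paper); RMSE and MAPE are the two metrics the framework uses.

```python
import numpy as np

def make_windows(series, history=48, horizon=24):
    """Build (input, target) pairs: `history` past readings predict the
    next `horizon` readings, the shape an LSTM forecaster would consume."""
    X, y = [], []
    for i in range(len(series) - history - horizon + 1):
        X.append(series[i:i + history])
        y.append(series[i + history:i + history + horizon])
    return np.array(X), np.array(y)

def rmse(actual, predicted):
    """Root mean square error."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.sqrt(np.mean((a - p) ** 2)))

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs((a - p) / a)) * 100)
```

Given 100 readings, a 48-step history and 24-step horizon yield 29 training windows.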
Aiming at the unreasonable structure and low efficiency of traditional statistical partitioning and publishing of location big data, a deep-learning-based partition-structure prediction method and a differential publishing method were proposed to enhance the efficacy of the partition algorithm and improve the availability of the published data. First, the two-dimensional space is intelligently partitioned and merged from the bottom up to construct a reasonable partition structure. The partition-structure matrices are then organized as a three-dimensional spatio-temporal sequence, and their spatio-temporal characteristics are extracted by a deep learning model to predict the partition structure. Finally, differential privacy budget allocation and Laplace noise addition are applied to the predicted partition structure to protect the privacy of the statistically partitioned and published location big data. Experimental comparison on real location big data sets demonstrates the advantages of the proposed method in improving both the query accuracy of the published data and the execution efficiency of the publishing algorithm.
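The final Laplace-noise step can be sketched as follows. The unit sensitivity and fixed seed are assumptions made for the demonstration; the paper's budget allocation across the partition hierarchy is not modeled here.

```python
import numpy as np

def publish_counts(counts, epsilon, sensitivity=1.0):
    """Add Laplace(sensitivity / epsilon) noise to partition counts, the
    standard mechanism for epsilon-differentially-private count queries."""
    rng = np.random.default_rng(0)  # fixed seed so the demo is reproducible
    scale = sensitivity / epsilon
    return counts + rng.laplace(loc=0.0, scale=scale, size=counts.shape)
```

A smaller epsilon (stronger privacy) yields a larger noise scale and noisier published counts; for Laplace noise of scale b, the standard deviation is b*sqrt(2).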
Based on research into Monte-Carlo simulation technology, a service-aware air-interface rate guarantee threshold was established, and a design method for the user distribution model and service model in perception-aware simulation planning was obtained; on this basis, a concrete simulation planning analysis approach was given. Taking the actual planning of a certain area as an example, comparison of the perceived service rate and the service success rate before and after planning shows that the simulation-based perception-aware planning method is feasible, effective and of good promotion value.
Printed media is still popular in today's society. Unfortunately, such media has several drawbacks; for example, it consumes large amounts of storage, which results in high maintenance costs. To keep printed information efficient and long-lasting, people usually convert it into digital format. In this paper, we built an Optical Character Recognition (OCR) system that automatically converts an image containing sentences in Latin characters into digital text. The system consists of several interrelated stages, including preprocessing, segmentation, feature extraction, classification, modeling and recognition. In preprocessing, a median filter is used to remove noise from the image and Otsu's method is used to binarize it, followed by character segmentation using connected-component labeling. An artificial neural network (ANN) is used for feature extraction and character recognition. The results show that the system is able to recognize the characters in an image, with a success rate influenced by the training of the system.
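Otsu's method, the binarization step named above, picks the threshold that maximizes the between-class variance of the grayscale histogram. A minimal sketch:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0.0
    for t in range(256):
        w_b += hist[t]                 # background pixel count
        if w_b == 0:
            continue
        w_f = total - w_b              # foreground pixel count
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mu_b = sum_b / w_b             # background mean
        mu_f = (sum_all - sum_b) / w_f # foreground mean
        var_between = w_b * w_f * (mu_b - mu_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

Pixels above the returned threshold become foreground; on a clearly bimodal image the threshold separates the two modes.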
Background. Despite the popularity of the self-similar traffic model, a number of tasks in assessing the quality of service in packet communication networks remain unresolved. Because there is no rigorous theoretical base to complement classical queuing theory when designing a packet communication network carrying self-similar traffic, there is no reliable and recognized methodology for calculating the parameters and quality indicators of information distribution systems under the self-similarity effect.
Objective. The aim of the paper is to improve the accuracy of calculating quality-of-service characteristics by obtaining a new formula for the traffic self-similarity coefficient as a function of the shape parameter of the Weibull or Pareto distribution. Self-similar traffic, or the time interval between stream packets, is described by these distributions.
Methods. To calculate the QoS characteristics, only the shape parameter a of the Weibull or Pareto distribution needs to be known; there is no need to compute the traffic's Hurst self-similarity coefficient in a rather complicated way, for example by the R/S method.
Results. A significant difference is detected between the real dependence and the linear dependence of the self-similarity coefficient H on the shape parameter a of the Weibull distribution or of the Pareto distribution.
Conclusions. The use of the real functional dependence of H on a enhances the accuracy of calculating the quality-of-service characteristics.
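For context, the classical linear relation that the paper finds deviates from the real dependence is, for heavy-tailed (Pareto-like) traffic with tail shape parameter a in (1, 2), H = (3 - a) / 2. A sketch of this baseline:

```python
def hurst_linear_pareto(a):
    """Classical linear approximation of the Hurst coefficient for
    heavy-tailed (Pareto-like) traffic with shape parameter 1 < a < 2:
    H = (3 - a) / 2.  The paper's point is that the real dependence of
    H on a deviates from this straight line."""
    if not 1.0 < a < 2.0:
        raise ValueError("self-similarity arises for shape parameters in (1, 2)")
    return (3.0 - a) / 2.0
```

As a decreases toward 1 (heavier tails), H approaches 1 and the traffic becomes more strongly self-similar.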
Keywords: quality of service; Hurst coefficient; self-similar traffic.
Current power analysis attacks on HMAC based on SM3 apply only to targets that leak both Hamming weight and Hamming distance information at the same time; if the target exhibits only a single leakage mode, these attack methods fail. To overcome this limitation, a novel power analysis attack on HMAC based on SM3 was proposed. A different attack target and its related intermediate variables are selected in each power analysis attack, and the attacks are carried out according to either the Hamming distance model or the Hamming weight model of the intermediate variables. After several power analysis attacks on the first four rounds of SM3, a system of equations composed of the results of all the attacks is obtained, and the ultimate attack target is derived by solving these equations. Experimental results show that the proposed attack method is effective. The method is universally applicable, since it works both when Hamming weight and Hamming distance leakage coexist and when only one of the two leakage models is available.
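The two side-channel leakage models the attack alternates between are straightforward to state; this minimal sketch shows only the models themselves, not the SM3 round functions or the correlation step of the full attack.

```python
def hamming_weight(x):
    """Hamming weight leakage model: the number of set bits in an
    intermediate value, assumed proportional to power consumption."""
    return bin(x).count("1")

def hamming_distance(x, y):
    """Hamming distance leakage model: the number of bits flipped when a
    register transitions from state x to state y."""
    return hamming_weight(x ^ y)
```

For example, loading 0b1011 leaks weight 3, while a register transition from 0b1100 to 0b1010 leaks distance 2.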
Wireless relays can solve coverage blind spots caused by the lack of cable transmission, difficulties in site property coordination for macro base stations, and blockage by high-rise buildings. Through actual field tests and network simulation, the coverage performance of relays was comprehensively assessed for outdoor coverage and indoor blind spots. By analyzing the distance between the relay and the buildings, and the relationship between the relay deployment height and indoor depth coverage, deployment recommendations are given for solving depth-coverage gaps.
Existing cooperative routing schemes cannot allocate wireless resources fairly, so the throughput of the minimum flow cannot satisfy the performance requirement. The multi-flow cooperative routing problem is formulated as a convex optimization problem with the goal of maximizing network utility. Based on dual decomposition and the subgradient method, a distributed fair cooperative routing algorithm for multi-gateway wireless mesh networks, FCRMG, is proposed. Simulation results show that, compared with non-cooperative routing based on the expected transmission time metric and cooperative routing based on a contention-aware metric, FCRMG largely improves the throughput of the minimum flow without decreasing the total network throughput.
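Dual decomposition with a subgradient update can be illustrated on a toy network utility maximization problem: flows share one link, each maximizes a logarithmic utility given a link price, and the price is adjusted by a subgradient step on the capacity violation. This is a minimal sketch of the technique, not FCRMG itself.

```python
def num_subgradient(n_flows, capacity, step=0.01, iters=5000):
    """Maximize sum(log x_i) s.t. sum(x_i) <= capacity by dual decomposition.
    Each flow solves its local problem given the price lam (x_i = 1/lam);
    the price follows a projected subgradient step on the constraint."""
    lam = 1.0
    x = [1.0 / lam] * n_flows
    for _ in range(iters):
        x = [1.0 / lam] * n_flows                       # local flow decisions
        lam = max(1e-6, lam + step * (sum(x) - capacity))  # price update
    return x, lam
```

With logarithmic utilities the optimum is the fair split x_i = capacity / n_flows, which the iteration approaches; this is the proportional-fairness property that motivates log utilities in fair routing.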
With the development of terminal and broadband network technology, smart-home-related services have developed rapidly. The latest developments in smart home applications and related home network technology trends were analyzed first, and the key elements of operating smart home services were identified. On this basis, a system architecture for operators to build smart home services was presented, the related key technologies were analyzed, and the feasibility of the system was verified.
A robust halftone image watermarking method is proposed. The method operates in the parity domain on pixel blocks; in particular, the parity sum of a pixel block is defined by comparing the average of the block with an image-dependent threshold. By altering each pixel block's parity via noise-balanced block error diffusion, the watermark is spread into the host image. The watermark is retrieved from each pixel block's parity using a majority-voting strategy, without reference to the original image. Compared with the state-of-the-art method in the parity domain, the results indicate that the proposed method offers a high watermark rate and watermark-rate flexibility. Moreover, it can extract the watermark directly from an attacked watermarked image without quantizing it into a halftone image, and it achieves high robustness against common attacks and against print-and-scan attacks with different types of printers and scanners.
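The parity-domain idea can be sketched in simplified form: derive a parity bit per block, and alter the block when its parity disagrees with the watermark bit. Both the parity definition used here (count of pixels above the threshold, modulo 2) and the single-pixel flip are illustrative assumptions standing in for the paper's average-based parity sum and noise-balanced block error diffusion.

```python
import numpy as np

def block_parity(block, threshold):
    """Illustrative parity bit: number of pixels above an image-dependent
    threshold, taken modulo 2."""
    return int(np.sum(block > threshold)) % 2

def embed_bit(block, threshold, bit):
    """Flip the pixel closest to the threshold when the block parity
    disagrees with the watermark bit (stand-in for error diffusion)."""
    block = block.copy()
    if block_parity(block, threshold) != bit:
        idx = np.unravel_index(np.argmin(np.abs(block - threshold)), block.shape)
        block[idx] = threshold + 1 if block[idx] <= threshold else threshold - 1
    return block
```

Extraction then reads each block's parity back; majority voting over repeated embeddings of the same bit, as the paper describes, makes the retrieval robust to attacks.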
The paper analyzes implementations of the BGP protocol on commercial and open-source routers and shows how some existing BGP extensions and routing-table isolation mechanisms can be used to solve issues found in standard BGP implementations.
ABTEM, an availability-based trust model, was presented; many availability-oriented security services can be built on this model. Performance results show that, when ABTEM is applied to the DSR protocol, malicious nodes can be successfully discovered according to their malicious acts at the routing layer. Furthermore, malicious nodes can be isolated to protect the network from their attacks, which significantly increases system availability.
At a moment when new telecommunications legislation (including broadcasting and pay TV) is again under discussion in Brazil, this article revisits principles present in earlier regulatory frameworks, examining their suitability to the contemporary scene in light of the changes the country has undergone in recent decades. The text debates the convergence between services, the public interest, the aims of broadcasting programming, and the State's role as a regulatory agent. These principles should guide the new legislation and are essential if the framework now under discussion is to be more than an alternative for solving short-term problems. <b>Keywords:</b> Broadcasting. Telecommunications. Legislation. Regulation.