
websocket Handle bottleneck due to the json message conversion to eprosima xtypes by the integration service #173

gedeon1976 opened this issue on Jan 8, 2022

Another issue that I'm experiencing is a WebSocket Handle bottleneck due to the JSON message conversion to eProsima xtypes by the Integration Service. This is happening with compressed images.

This seems to happen only when the WebSocket Handle is used, at least as far as I have tested. I have added some time measurement to the convert__msg.cpp.em template to show the duration between each received frame:

  void subscription_callback(
          const Ros2_Msg& msg)
  {
      // Measure the time elapsed since the previous frame was received
      auto start = std::chrono::steady_clock::now();
      logger << utils::Logger::Level::INFO
             << "Receiving message from ROS 2 for topic '" << _topic_name << "' "
             << "time from last frame: " << std::chrono::duration_cast<std::chrono::milliseconds>(start - t_last).count() << "ms "
             << std::endl;

      // Convert the ROS 2 message into an eProsima xtypes DynamicData instance
      xtypes::DynamicData data(_message_type);
      convert_to_xtype(msg, data);

      //logger << utils::Logger::Level::INFO
      //       << "Received message: [[ " << data << " ]]" << std::endl;
      //       << "Received message: [[ " " ]]" << std::endl;  // change to this to avoid data printing

      (*_callback)(data, nullptr);
      t_last = start;
  }

and added the following member variable to the class:

std::chrono::time_point<std::chrono::steady_clock> t_last;

For example, the Integration Service is able to connect to the ROS 2 topics with reasonably good processing time over a Wi-Fi connection:

[Screenshot: integration-service]

But when I connect from a web page using roslibjs over a Wi-Fi connection, processing time increases by roughly 10x and the video reception on the web becomes very slow, around 1-2 fps. You can see the difference after connecting in the image below:

[Screenshot: integration-service-websocket-sh]

I performed some profiling, and the issue seems to be related to the message conversion in `convert_to_xtype(msg, data)`.

[Screenshot: profiling results]
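
To narrow this down, here is a minimal sketch of a scoped timer (illustrative only, not part of the Integration Service sources) that could be dropped around the `convert_to_xtype()` call in convert__msg.cpp.em to show how much of the per-frame time is spent in the ROS 2 to xtypes conversion itself, as opposed to the downstream WebSocket JSON encoding:

    // Minimal sketch of a scoped timer (illustrative only, not part of the
    // Integration Service sources).
    #include <chrono>
    #include <iostream>
    #include <string>
    #include <utility>

    struct ScopedTimer
    {
        explicit ScopedTimer(std::string label)
            : label(std::move(label))
            , start(std::chrono::steady_clock::now())
        {
        }

        ~ScopedTimer()
        {
            const auto end = std::chrono::steady_clock::now();
            std::cout << label << ": "
                      << std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count()
                      << " ms" << std::endl;
        }

        std::string label;
        std::chrono::steady_clock::time_point start;
    };

    // Illustrative usage inside the subscription callback shown above:
    //
    //   {
    //       ScopedTimer timer("convert_to_xtype");
    //       convert_to_xtype(msg, data);
    //   }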

The profiling also shows that the following functions spend a lot of time (see the sketch after this list):

- `websocket::EndPoint::publish`
- `Websocket::JsonEncoding::encode_publication_msg()`
- `Eprosima::is::json_xtypes_to_json`
- `Eprosima::is::json_xtypes::add_json_node`
- `nlohmann::is::basic_json<std::map, std::vector, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>>, bool, long, unsigned long, double, std::allocator, nlohmann::is::adl_serializer>::operator[]`
- `std::vector<nlohmann::is::basic_json<std::map, std::vector, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>>, bool, long, unsigned long, double, std::allocator, nlohmann::is::adl_serializer>, std::allocator<nlohmann::is::basic_json<std::map, std::vector, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>>, bool, long, unsigned long, double, std::allocator, nlohmann::is::adl_serializer>>>::insert`
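
The `operator[]` and `vector::insert` entries above hint that the JSON array for the image payload is being built element by element. The following is a standalone sketch (not Integration Service code, assuming nlohmann/json is available) that compares appending a compressed-image-sized byte payload to a `nlohmann::json` array one byte at a time against converting the whole vector in a single call:

    // Standalone sketch (not Integration Service code); assumes nlohmann/json.
    #include <chrono>
    #include <cstdint>
    #include <iostream>
    #include <vector>

    #include <nlohmann/json.hpp>

    int main()
    {
        // Roughly the size of one compressed camera frame.
        const std::vector<std::uint8_t> payload(200 * 1024, 0x42);

        using clock = std::chrono::steady_clock;
        const auto ms = [](clock::time_point a, clock::time_point b)
        {
            return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
        };

        // 1) One JSON element appended per byte.
        const auto t0 = clock::now();
        nlohmann::json per_element = nlohmann::json::array();
        for (const std::uint8_t byte : payload)
        {
            per_element.push_back(byte);
        }
        const auto t1 = clock::now();

        // 2) Bulk conversion of the whole vector at once.
        const auto t2 = clock::now();
        const nlohmann::json bulk = payload;
        const auto t3 = clock::now();

        std::cout << "per-element: " << ms(t0, t1) << " ms, "
                  << "bulk: " << ms(t2, t3) << " ms" << std::endl;

        return 0;
    }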

So, could somebody from eProsima check this behavior when using the WebSocket-SH handler?

Originally posted by @gedeon1976 in #169 (comment)
