When we return gzipped data from an Azure Function back to CloudScript, it appears PlayFab does not unzip the data before writing it into the function's result.
Here is an example of the data we receive on the client:
{"code":200,"status":"OK","data":{"ExecutionTimeMilliseconds":343,"FunctionName":"PlayerLogin","FunctionResult":"�\b\0\0\0\0\0uS]o�0�+��Cd��<�T=�:�q&Y8��A��+E��];�T��%�gf���ə���X�\n��J���Ԃ�HjM\rY�y�R\aX�F$%�r֡��ي�T��+B&�/�u���R�Ծ�[U���,3�7$��8}��>��Y�!���ײP��,�Ӱ�y`*���ǽn\fO*�Y�C�q5�K����c\b�̗7?o�g�������a7�����,\n�۞�%�B�]i�ܔ\n%M��\\����Dn�9�\f����&�p��V����,� �[��T�����i���{�E�q��\n\f�8+��T���\bC�h�\v\\\0��#pUa��`\"/��0�Eq�v{\b����w�� �+�����ʮ�9����������8eI(D��\0sP�[���Pi2�l��q:~o��1�(�q��&�F��D�EHc�e�[��`N���s�6�ĹY4��uqÒ���n�WZRe�2�\a ����\v1]U��!��yi�O�@��\\�G�\aq�o�AC��z&\n�\0\0","FunctionResultTooLarge":false}}
Flow:
1. Client calls ExecuteFunction (accept-encoding: gzip)
2. PlayFab calls Azure (accept-encoding: gzip, content-encoding: "")
3. We return gzipped data (content-encoding: gzip)
4. PlayFab returns the response to the client
Here, FunctionResult is still the gzipped content.
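Until this is fixed, the still-gzipped FunctionResult can be detected and unwrapped client-side. A minimal sketch in Python (the helper name is ours, not part of any PlayFab SDK; it assumes the raw bytes arrive intact):

```python
import gzip
import json

def unwrap_function_result(raw: bytes) -> dict:
    """Decompress FunctionResult if it was passed through still gzipped.

    Hypothetical helper, not part of the PlayFab SDK.
    """
    # Every gzip stream begins with the magic bytes 0x1f 0x8b (RFC 1952)
    if raw[:2] == b"\x1f\x8b":
        raw = gzip.decompress(raw)
    return json.loads(raw)

# Simulate what the Azure function returns: gzipped JSON
payload = gzip.compress(json.dumps({"status": "OK"}).encode())
print(unwrap_function_result(payload))  # {'status': 'OK'}
```

The magic-byte check makes the helper safe to call on both compressed and already-decompressed results, so it keeps working if PlayFab starts unzipping correctly.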
We would expect PlayFab to decompress the content it receives from Azure, build the final JSON, and gzip it again when returning to the client.
The major problem is that the returned data size can increase dramatically when switching from CloudScript to Azure, resulting in multiple failed functions because the payload returned from Azure to PlayFab exceeds the size limit.
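One plausible contributor to the size blow-up: embedding raw gzip bytes in a JSON string forces every non-ASCII and control byte to be escaped, so the wrapped payload grows well past the compressed size and can even exceed the original uncompressed JSON. A quick illustration (our own demo data, not PlayFab output):

```python
import gzip
import json

# Some sample JSON a function might return
inner = json.dumps({"items": list(range(200))}).encode()
gz = gzip.compress(inner)

# Embedding the raw gzip bytes in a JSON string value forces
# escaping (\u00xx for non-ASCII and control bytes), inflating it
wrapped = json.dumps({"FunctionResult": gz.decode("latin-1")})

print(len(inner), len(gz), len(wrapped.encode("utf-8")))
```

Running this shows the escaped wrapper is far larger than the compressed blob, which matches the failures we see when the result approaches the size limit.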
P.S.: The same issue occurs when using ExecuteFunction.cs for local Azure debugging. It is clearly visible in the code that the data returned from Azure is never unzipped, but is written directly into the FunctionResult property, and the entire JSON is then compressed again if accept-encoding: gzip is set.
P.P.S.: It also looks like PlayFab never compresses the data it returns to the client from ExecuteFunction, regardless of whether CompressApiData is true or accept-encoding is set to gzip.