There is a memory error when deduplicating Chinese data.
23/04/19 19:44:17 WARN MemoryStore: Not enough space to cache rdd_7_0 in memory! (computed 176.2 MiB so far)
23/04/19 19:44:17 WARN BlockManager: Block rdd_7_0 could not be removed as it was not found on disk or in memory
23/04/19 19:44:17 WARN BlockManager: Putting block rdd_7_0 failed
23/04/19 19:44:17 WARN MemoryStore: Not enough space to cache rdd_7_2 in memory! (computed 176.2 MiB so far)
23/04/19 19:44:17 WARN BlockManager: Block rdd_7_2 could not be removed as it was not found on disk or in memory
23/04/19 19:44:17 WARN BlockManager: Putting block rdd_7_2 failed
23/04/19 19:44:17 WARN MemoryStore: Not enough space to cache rdd_7_9 in memory! (computed 176.3 MiB so far)
23/04/19 19:44:17 WARN BlockManager: Block rdd_7_9 could not be removed as it was not found on disk or in memory
23/04/19 19:44:17 WARN BlockManager: Putting block rdd_7_9 failed
23/04/19 19:44:19 WARN MemoryStore: Not enough space to cache rdd_7_7 in memory! (computed 176.6 MiB so far)
23/04/19 19:44:19 WARN BlockManager: Block rdd_7_7 could not be removed as it was not found on disk or in memory
23/04/19 19:44:19 WARN BlockManager: Putting block rdd_7_7 failed
23/04/19 19:44:41 WARN BlockManager: Block rdd_7_6 could not be removed as it was not found on disk or in memory
23/04/19 19:44:42 ERROR Executor: Exception in task 6.0 in stage 1.0 (TID 30545)
java.lang.OutOfMemoryError: Java heap space
The memory available to each executor is smaller than the size of an input partition.
You can try increasing the number of partitions so that each partition fits in executor memory; see the sketch below.
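For example, a minimal PySpark sketch (the executor memory size, partition count, and input path here are illustrative values, not taken from this issue):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("dedup")
    # Give each executor more heap; tune this to your cluster.
    .config("spark.executor.memory", "8g")
    # More shuffle partitions -> smaller partitions after wide transformations.
    .config("spark.sql.shuffle.partitions", "2000")
    .getOrCreate()
)

df = spark.read.json("input.jsonl")  # hypothetical input path
# Explicitly repartition so each partition fits comfortably in executor memory.
df = df.repartition(2000)
```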
Also, could you check the Spark History Server to see which executors show data skew and spill?
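If the history server isn't handy, one rough way to check for skew programmatically (a sketch, assuming the data is already loaded as a DataFrame named `df` as above):

```python
# Count records per partition; a large gap between max and min indicates skew.
sizes = df.rdd.mapPartitions(lambda it: [sum(1 for _ in it)]).collect()
print(f"partitions={len(sizes)} min={min(sizes)} max={max(sizes)}")
```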
The deduplication logic still needs improvement to handle these problems.