Sports Analytics and Performance Optimization

Sports Analytics and Performance Optimization

In this chapter we model the members of an amateur football club. The work grew out of one of the author's contributions to the wellbeing of a local football community. The ultimate goal is to split members into reasonably balanced teams so that games are as enjoyable and competitive as possible.

The challenge

For every Sunday's game we run a poll to see who is joining. The common agreement is that we need a minimum of 10 and a maximum of 18 players, which means team sizes vary from game to game.

  • even though we have started to collect data for our games, we still have only around 20 games recorded. The set of players also changed over the course of the recording period, so we have even less data for any single player.
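The size constraint above can be sketched as a small helper (a purely illustrative sketch; the constant names and the `team_sizes` function are ours, not part of the club's tooling):

```python
# Hypothetical helper: validate a Sunday poll count and split it into
# two teams under the club's 10-18 player agreement.
MIN_PLAYERS, MAX_PLAYERS = 10, 18

def team_sizes(num_confirmed: int) -> tuple[int, int]:
    if not MIN_PLAYERS <= num_confirmed <= MAX_PLAYERS:
        raise ValueError(f"need between {MIN_PLAYERS} and {MAX_PLAYERS} players")
    # An odd count yields teams differing by one player.
    return num_confirmed // 2, num_confirmed - num_confirmed // 2

print(team_sizes(13))  # (6, 7)
```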

The plan

import tensorflow as tf
from tensorflow.keras import layers, Model, Input, initializers
from datetime import date
import numpy as np
import os

from competition_manager import *
import random


def set_seed(seed=42):
    np.random.seed(seed)                  # Fix NumPy random seed
    random.seed(seed)                     # Fix Python built-in random seed
    tf.random.set_seed(seed)              # Fix TensorFlow random seed

    # Optional but recommended: configure TensorFlow for deterministic ops
    os.environ['TF_DETERMINISTIC_OPS'] = '1'

# Call this function at the very start, before building or training your model
seed_value = 42
set_seed(seed_value)
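A quick sanity check that the seeding behaves as intended (a standalone sketch using only NumPy, so it runs even where TensorFlow is unavailable): re-seeding should make the random draws repeat exactly.

```python
import numpy as np

# Re-seeding with the same value must reproduce identical draws.
np.random.seed(42)
first = np.random.rand(3)
np.random.seed(42)
second = np.random.rand(3)
print(np.array_equal(first, second))  # True
```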

Generate synthetic teams and games

Calculating a team's strength based on players' individual strengths and rule-based interactions

def calculate_team_strength(team_players):
    # Base strength sum
    strength = player_strengths[team_players].sum()
    print(f"Base strength of team {team_players}: {strength:.4f}")
    
    # Compute favorite player boost (fixed)
    for i, pair in enumerate(friend_pairs):
        if all(player in team_players for player in pair):
            boost = friend_pairs_boost[i]
            strength += boost
            print(f"Favorite pair boost applied for players {pair}: +{boost:.4f}, total strength now {strength:.4f}")
    for i, triplet in enumerate(friend_triplets):
        if all(player in team_players for player in triplet):
            boost = friend_triplets_boost[i]
            strength += boost
            print(f"Favorite triplet boost applied for players {triplet}: +{boost:.4f}, total strength now {strength:.4f}")
    
    # Compute skills boost correlated with players' average strength
    for i, pair in enumerate(skilled_pairs):
        if all(player in team_players for player in pair):
            avg_strength = player_strengths[list(pair)].mean()
            boost = skilled_pairs_boost[i] * avg_strength
            strength += boost
            print(f"Skills pair boost for players {pair}: avg strength {avg_strength:.4f} * boost factor {skilled_pairs_boost[i]:.4f} = +{boost:.4f}, total strength now {strength:.4f}")
    
    for i, triplet in enumerate(skilled_triplets):
        if all(player in team_players for player in triplet):
            avg_strength = player_strengths[list(triplet)].mean()
            boost = skilled_triplets_boost[i] * avg_strength
            strength += boost
            print(f"Skills triplet boost for players {triplet}: avg strength {avg_strength:.4f} * boost factor {skilled_triplets_boost[i]:.4f} = +{boost:.4f}, total strength now {strength:.4f}")
    
    return strength
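`calculate_team_strength` depends on module-level globals defined in the next cell; the core pair-boost idea can be isolated in a self-contained sketch (toy numbers below, not the generated dataset):

```python
import numpy as np

# Toy data: strengths indexed by player id; index 0 is the mask slot.
player_strengths = np.array([0.0, 0.5, 0.8, 0.3, 0.9])
friend_pairs = [(1, 2)]                 # one friend pair
friend_pairs_boost = np.array([0.1])    # flat synergy bonus

def team_strength(team_players):
    strength = player_strengths[team_players].sum()
    for i, pair in enumerate(friend_pairs):
        # Boost applies only when the whole pair plays on this team.
        if all(p in team_players for p in pair):
            strength += friend_pairs_boost[i]
    return strength

print(team_strength([1, 2, 3]))  # 0.5 + 0.8 + 0.3 + 0.1 boost
print(team_strength([1, 3]))     # no pair present: 0.5 + 0.3
```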

Generating the teams, friend pairs/triplets and skilled pairs/triplets

np.random.seed(42)  # for reproducibility

NUM_PLAYERS = 30  # player 0 is ignored/masked and we will need to account for this
MIN_TEAM_SIZE = 5
MAX_TEAM_SIZE = 9
NUM_GAMES = 100

# 1. Generate players' strengths: a single float in [0, 1]
player_strengths = np.random.rand(NUM_PLAYERS + 1)  # index 0 is ignored/masked
player_strengths[0] = 0.0  # zero the mask slot so zero-padded team entries add nothing

# 2. Generate favorite player pairs and triplets (friends)
num_friend_pairs = 10
num_friend_triplets = 5

# Randomly select unique pairs
friend_pairs = [tuple(np.random.choice(np.arange(1, NUM_PLAYERS + 1), size=2, replace=False)) for _ in range(num_friend_pairs)]
friend_pairs_boost = np.random.uniform(0.05, 0.15, size=num_friend_pairs)  # small boost

# Randomly select unique triplets
friend_triplets = [tuple(np.random.choice(np.arange(1, NUM_PLAYERS + 1), size=3, replace=False)) for _ in range(num_friend_triplets)]
friend_triplets_boost = np.random.uniform(0.1, 0.25, size=num_friend_triplets)  # larger boost

# 3. Generate skilled pairs and triplets (high skill synergy)
num_skilled_pairs = 8
num_skilled_triplets = 4

skilled_pairs = [tuple(np.random.choice(np.arange(1, NUM_PLAYERS + 1), size=2, replace=False)) for _ in range(num_skilled_pairs)]
skilled_pairs_boost = np.random.uniform(0.1, 0.2, size=num_skilled_pairs)  # moderate boost

skilled_triplets = [tuple(np.random.choice(np.arange(1, NUM_PLAYERS + 1), size=3, replace=False)) for _ in range(num_skilled_triplets)]
skilled_triplets_boost = np.random.uniform(0.15, 0.3, size=num_skilled_triplets)  # strong boost

# 4. Prepare arrays to hold the dataset
teamA_data = np.zeros((NUM_GAMES, MAX_TEAM_SIZE), dtype=int)
teamB_data = np.zeros((NUM_GAMES, MAX_TEAM_SIZE), dtype=int)
outcomes = np.zeros(NUM_GAMES)

def drop_zeroes_for_sum(players_strengths):
    # Drop zero-padded (masked) entries before summing real players' strengths
    return players_strengths[players_strengths != 0]

for game_i in range(NUM_GAMES):
    # Random team size, shared by both teams, in [MIN_TEAM_SIZE, MAX_TEAM_SIZE]

    team_size = np.random.randint(MIN_TEAM_SIZE, MAX_TEAM_SIZE + 1)

    # Shuffle all 30 players and split into two disjoint, equal-size teams
    all_players = np.random.permutation(np.arange(1, NUM_PLAYERS + 1))
    teamA_players = all_players[:team_size]
    teamB_players = all_players[team_size:2*team_size]
    
    print(f"Game # {game_i} evaluation: ")
    # Compute team strengths as sum of player strengths
    teamA_strength = calculate_team_strength(teamA_players)
    teamB_strength = calculate_team_strength(teamB_players)
    print("=" * 50)
    
    # print(f"Team A strengths {player_strengths[teamA_players]} Total: {teamA_strength}")

    # Match outcome as a continuous margin: positive favours Team A,
    # negative favours Team B; small Gaussian noise simulates unpredictability
    outcome = teamA_strength - teamB_strength + np.random.normal(scale=0.1)
    outcomes[game_i] = outcome

    # Pad teams to max size using zeros (which corresponds to masked player)
    teamA_data[game_i, :team_size] = teamA_players
    teamB_data[game_i, :team_size] = teamB_players

print("player_strengths shape:", player_strengths.shape)
print("teamA_data shape:", teamA_data.shape)
print("teamB_data shape:", teamB_data.shape)
print("outcomes shape:", outcomes.shape)

# Example: print the first 3 games
for i in range(3):
    print(f"Game {i}:")
    teamA_pls = teamA_data[i]
    print(" Team A players: ", teamA_pls)
    print(" Team A players' strengths: ", player_strengths[teamA_pls])
    print(" Team A base strength: ", drop_zeroes_for_sum(player_strengths[teamA_pls]).sum())

    teamB_pls = teamB_data[i]
    print(" Team B players: ", teamB_pls)
    print(" Team B players' strengths: ", player_strengths[teamB_pls])
    print(" Team B base strength: ", drop_zeroes_for_sum(player_strengths[teamB_pls]).sum())
    print(" Outcome (positive means Team A is stronger):", outcomes[i])
Game # 0 evaluation: 
Base strength of team [ 4  8 19 24 23  3 26  7 25]: 4.3205
Base strength of team [30 15 16 27 11 13 29 20 17]: 3.9747
Favorite pair boost applied for players (np.int64(30), np.int64(20)): +0.0999, total strength now 4.0747
Favorite pair boost applied for players (np.int64(11), np.int64(15)): +0.1495, total strength now 4.2241
Skills pair boost for players (np.int64(20), np.int64(11)): avg strength 0.7909 * boost factor 0.1480 = +0.1171, total strength now 4.3412
==================================================
Game # 1 evaluation: 
Base strength of team [16  9 21  4  5  1  8 17  6]: 3.5985
Favorite pair boost applied for players (np.int64(16), np.int64(4)): +0.1072, total strength now 3.7057
Favorite pair boost applied for players (np.int64(16), np.int64(8)): +0.1269, total strength now 3.8325
Base strength of team [10 26 12 11 30 18 28  3  7]: 5.1193
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 5.1781
==================================================
Game # 2 evaluation: 
Base strength of team [28 29 24  5 18]: 1.6829
Base strength of team [27 19 14  1 11]: 2.9079
Skills pair boost for players (np.int64(27), np.int64(14)): avg strength 0.3480 * boost factor 0.1411 = +0.0491, total strength now 2.9570
==================================================
... (output for the remaining games truncated; it follows the same pattern)
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 3.2368
Skills pair boost for players (np.int64(10), np.int64(22)): avg strength 0.1564 * boost factor 0.1750 = +0.0274, total strength now 3.2642
==================================================
Game # 56 evaluation: 
Base strength of team [ 4 24  6 25  9 16]: 2.4677
Favorite pair boost applied for players (np.int64(24), np.int64(16)): +0.0730, total strength now 2.5407
Favorite pair boost applied for players (np.int64(16), np.int64(4)): +0.1072, total strength now 2.6479
Base strength of team [14 30 10 29 19  7]: 2.0138
==================================================
Game # 57 evaluation: 
Base strength of team [21 18 17  4 14  3 24 26]: 2.6884
Favorite pair boost applied for players (np.int64(17), np.int64(14)): +0.0780, total strength now 2.7664
Base strength of team [ 2 23  9 13 29  1 12 16]: 4.1526
==================================================
Game # 58 evaluation: 
Base strength of team [28  5 14 22 27]: 1.7366
Skills pair boost for players (np.int64(27), np.int64(14)): avg strength 0.3480 * boost factor 0.1411 = +0.0491, total strength now 1.7857
Base strength of team [ 6  3 19 15 30]: 1.7389
Skills pair boost for players (np.int64(30), np.int64(6)): avg strength 0.3328 * boost factor 0.1552 = +0.0516, total strength now 1.7906
==================================================
Game # 59 evaluation: 
Base strength of team [16  2  5 19  4]: 1.6395
Favorite pair boost applied for players (np.int64(16), np.int64(4)): +0.1072, total strength now 1.7467
Base strength of team [22  1 28 10  6]: 1.9139
Skills pair boost for players (np.int64(10), np.int64(22)): avg strength 0.1564 * boost factor 0.1750 = +0.0274, total strength now 1.9413
==================================================
Game # 60 evaluation: 
Base strength of team [22 10 14 16  4 26]: 1.1545
Favorite pair boost applied for players (np.int64(16), np.int64(4)): +0.1072, total strength now 1.2617
Skills pair boost for players (np.int64(16), np.int64(22)): avg strength 0.2982 * boost factor 0.1985 = +0.0592, total strength now 1.3209
Skills pair boost for players (np.int64(10), np.int64(22)): avg strength 0.1564 * boost factor 0.1750 = +0.0274, total strength now 1.3482
Base strength of team [27  2  8 24  9 30]: 3.6190
==================================================
Game # 61 evaluation: 
Base strength of team [ 8 15 24 29 28  3]: 2.4781
Base strength of team [18 20  5 23 19  1]: 2.8081
==================================================
Game # 62 evaluation: 
Base strength of team [16  2 17 19 23 21 11]: 3.3280
Base strength of team [13 18  1 27  9 12 29]: 3.6962
==================================================
Game # 63 evaluation: 
Base strength of team [10  8 20  5 21  9]: 2.2371
Favorite pair boost applied for players (np.int64(9), np.int64(10)): +0.1248, total strength now 2.3619
Base strength of team [ 3 11 29  2 26 15]: 2.7301
Favorite pair boost applied for players (np.int64(11), np.int64(15)): +0.1495, total strength now 2.8795
==================================================
Game # 64 evaluation: 
Base strength of team [23 24 27 12 28  8 29  3]: 4.0077
Base strength of team [ 2 18 25 19  5 10  9  7]: 3.9912
Favorite pair boost applied for players (np.int64(9), np.int64(10)): +0.1248, total strength now 4.1159
==================================================
Game # 65 evaluation: 
Base strength of team [18 15 16 11 12 25  9  8  1]: 5.7670
Favorite pair boost applied for players (np.int64(16), np.int64(8)): +0.1269, total strength now 5.8939
Favorite pair boost applied for players (np.int64(11), np.int64(15)): +0.1495, total strength now 6.0433
Favorite pair boost applied for players (np.int64(18), np.int64(8)): +0.0970, total strength now 6.1403
Skills pair boost for players (np.int64(8), np.int64(11)): avg strength 0.7855 * boost factor 0.1778 = +0.1396, total strength now 6.2800
Base strength of team [26  2 13  7 29 24  6  4 22]: 3.0190
==================================================
Game # 66 evaluation: 
Base strength of team [ 5 27  1 11 23  2 19 30]: 4.5880
Base strength of team [ 8 26 13 15  9 18  6 17]: 2.9194
Favorite pair boost applied for players (np.int64(18), np.int64(8)): +0.0970, total strength now 3.0164
==================================================
Game # 67 evaluation: 
Base strength of team [ 2  9  6 26 23  4]: 2.2202
Base strength of team [13 29 28 25  1  8]: 3.1882
==================================================
Game # 68 evaluation: 
Base strength of team [ 8  3 29 18 20 13]: 2.5024
Favorite pair boost applied for players (np.int64(18), np.int64(8)): +0.0970, total strength now 2.5994
Base strength of team [12 19  5 28 16 15]: 2.3597
==================================================
Game # 69 evaluation: 
Base strength of team [22 18 13 25  1 26 12  2]: 4.4364
Base strength of team [23  9 11 30 27  5 16 28]: 4.2188
==================================================
Game # 70 evaluation: 
Base strength of team [24 25 14 15  2 28 23]: 3.2972
Favorite pair boost applied for players (np.int64(15), np.int64(14)): +0.0544, total strength now 3.3516
Base strength of team [17 21  4  9 13 16 27]: 2.5592
Favorite pair boost applied for players (np.int64(16), np.int64(4)): +0.1072, total strength now 2.6664
==================================================
Game # 71 evaluation: 
Base strength of team [12 17 19  2 13 22 28 10 23]: 3.8643
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 3.9230
Skills pair boost for players (np.int64(10), np.int64(22)): avg strength 0.1564 * boost factor 0.1750 = +0.0274, total strength now 3.9503
Base strength of team [ 7  9 30 15 20 25 14  4 21]: 4.2396
Favorite pair boost applied for players (np.int64(30), np.int64(20)): +0.0999, total strength now 4.3395
Favorite pair boost applied for players (np.int64(15), np.int64(14)): +0.0544, total strength now 4.3938
Favorite pair boost applied for players (np.int64(7), np.int64(21)): +0.1383, total strength now 4.5322
Favorite triplet boost applied for players (np.int64(30), np.int64(4), np.int64(15)): +0.1562, total strength now 4.6884
==================================================
Game # 72 evaluation: 
Base strength of team [30  8  7 16 12 13 10 19]: 3.7357
Favorite pair boost applied for players (np.int64(16), np.int64(8)): +0.1269, total strength now 3.8625
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 3.9212
Base strength of team [11 14 17 18 15 21 29 22]: 2.7699
Favorite pair boost applied for players (np.int64(15), np.int64(14)): +0.0544, total strength now 2.8243
Favorite pair boost applied for players (np.int64(11), np.int64(15)): +0.1495, total strength now 2.9737
Favorite pair boost applied for players (np.int64(17), np.int64(14)): +0.0780, total strength now 3.0517
Skills triplet boost for players (np.int64(18), np.int64(15), np.int64(21)): avg strength 0.2516 * boost factor 0.1758 = +0.0442, total strength now 3.0959
==================================================
Game # 73 evaluation: 
Base strength of team [ 7  8 28  6 13 10]: 2.3507
Base strength of team [23 24 25  4 29 27]: 2.3243
==================================================
Game # 74 evaluation: 
Base strength of team [19 18 11  5  8 28 10]: 3.0632
Favorite pair boost applied for players (np.int64(18), np.int64(8)): +0.0970, total strength now 3.1602
Skills pair boost for players (np.int64(8), np.int64(11)): avg strength 0.7855 * boost factor 0.1778 = +0.1396, total strength now 3.2998
Base strength of team [29 16 13  2 17 23  9]: 2.8942
==================================================
Game # 75 evaluation: 
Base strength of team [25  2 22  1  5 24 21 13 11]: 4.6938
Favorite triplet boost applied for players (np.int64(5), np.int64(21), np.int64(24)): +0.1939, total strength now 4.8877
Base strength of team [26 15  7  4 12  9 18 10 29]: 3.4448
Favorite pair boost applied for players (np.int64(9), np.int64(10)): +0.1248, total strength now 3.5695
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 3.6283
==================================================
Game # 76 evaluation: 
Base strength of team [12 10 16  5 23]: 1.6796
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 1.7383
Base strength of team [18 29 27 30  7]: 2.4664
==================================================
Game # 77 evaluation: 
Base strength of team [11 16 19 20 10 14 26 30]: 3.1869
Favorite pair boost applied for players (np.int64(30), np.int64(20)): +0.0999, total strength now 3.2868
Skills pair boost for players (np.int64(20), np.int64(11)): avg strength 0.7909 * boost factor 0.1480 = +0.1171, total strength now 3.4039
Base strength of team [23 22 24  6 28 18 15  7]: 3.2466
==================================================
Game # 78 evaluation: 
Base strength of team [11 16 15 29 10 21 27]: 2.1783
Favorite pair boost applied for players (np.int64(11), np.int64(15)): +0.1495, total strength now 2.3278
Base strength of team [ 5 17 19 18 24 14  6]: 2.0999
Favorite pair boost applied for players (np.int64(17), np.int64(14)): +0.0780, total strength now 2.1779
==================================================
Game # 79 evaluation: 
Base strength of team [23 21  9 26  2 25 15  5  7]: 4.1363
Favorite pair boost applied for players (np.int64(7), np.int64(21)): +0.1383, total strength now 4.2747
Base strength of team [10  1 12  4 11 28 18 19 27]: 4.7595
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 4.8182
==================================================
Game # 80 evaluation: 
Base strength of team [10  2 12  4  6]: 1.7991
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 1.8578
Base strength of team [22 28 20 16 15]: 1.9841
Skills pair boost for players (np.int64(16), np.int64(22)): avg strength 0.2982 * boost factor 0.1985 = +0.0592, total strength now 2.0433
==================================================
Game # 81 evaluation: 
Base strength of team [11  6 10 12  9  7 21 24]: 4.0508
Favorite pair boost applied for players (np.int64(7), np.int64(21)): +0.1383, total strength now 4.1892
Favorite pair boost applied for players (np.int64(9), np.int64(10)): +0.1248, total strength now 4.3140
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 4.3727
Base strength of team [28  3 29 22 16  1 30  4]: 3.5482
Favorite pair boost applied for players (np.int64(16), np.int64(4)): +0.1072, total strength now 3.6554
Favorite triplet boost applied for players (np.int64(29), np.int64(16), np.int64(4)): +0.2285, total strength now 3.8839
Skills pair boost for players (np.int64(16), np.int64(22)): avg strength 0.2982 * boost factor 0.1985 = +0.0592, total strength now 3.9431
==================================================
Game # 82 evaluation: 
Base strength of team [21  8 17 16  1]: 2.5203
Favorite pair boost applied for players (np.int64(16), np.int64(8)): +0.1269, total strength now 2.6472
Base strength of team [22 20 15 23 26]: 1.6534
==================================================
Game # 83 evaluation: 
Base strength of team [14 29 22 16 30  1  8 19 12]: 4.1077
Favorite pair boost applied for players (np.int64(16), np.int64(8)): +0.1269, total strength now 4.2346
Skills pair boost for players (np.int64(16), np.int64(22)): avg strength 0.2982 * boost factor 0.1985 = +0.0592, total strength now 4.2938
Base strength of team [20  2 10 24 28  9  7 13  3]: 4.7982
Favorite pair boost applied for players (np.int64(9), np.int64(10)): +0.1248, total strength now 4.9229
==================================================
Game # 84 evaluation: 
Base strength of team [14  8 30 16 11]: 2.6646
Favorite pair boost applied for players (np.int64(16), np.int64(8)): +0.1269, total strength now 2.7915
Skills pair boost for players (np.int64(8), np.int64(11)): avg strength 0.7855 * boost factor 0.1778 = +0.1396, total strength now 2.9311
Base strength of team [20  6  9 18 17]: 2.3347
Skills pair boost for players (np.int64(20), np.int64(6)): avg strength 0.3350 * boost factor 0.1165 = +0.0390, total strength now 2.3737
==================================================
Game # 85 evaluation: 
Base strength of team [ 9  8 13 25 28 20  1 16  2]: 5.4979
Favorite pair boost applied for players (np.int64(16), np.int64(8)): +0.1269, total strength now 5.6248
Base strength of team [30 21  7  3 17 12 23 29  5]: 4.1379
Favorite pair boost applied for players (np.int64(7), np.int64(21)): +0.1383, total strength now 4.2762
==================================================
Game # 86 evaluation: 
Base strength of team [28 15  6 23  8 12 14 26]: 3.0153
Favorite pair boost applied for players (np.int64(15), np.int64(14)): +0.0544, total strength now 3.0697
Base strength of team [ 7 11 24  1 27 18 19 20]: 5.0921
Skills pair boost for players (np.int64(20), np.int64(11)): avg strength 0.7909 * boost factor 0.1480 = +0.1171, total strength now 5.2092
==================================================
Game # 87 evaluation: 
Base strength of team [21 13 20  1 16 14  9 12]: 3.9410
Base strength of team [10 15  2 26 11 27  6  4]: 2.8339
Favorite pair boost applied for players (np.int64(11), np.int64(15)): +0.1495, total strength now 2.9834
==================================================
Game # 88 evaluation: 
Base strength of team [20 18 25 13  5 10 15  2]: 3.1333
Base strength of team [ 3 30 22  7  8 29  1 11]: 4.9327
Skills pair boost for players (np.int64(8), np.int64(11)): avg strength 0.7855 * boost factor 0.1778 = +0.1396, total strength now 5.0723
==================================================
Game # 89 evaluation: 
Base strength of team [12 19  4 14 15 26 18 27]: 2.7908
Favorite pair boost applied for players (np.int64(15), np.int64(14)): +0.0544, total strength now 2.8451
Skills pair boost for players (np.int64(27), np.int64(14)): avg strength 0.3480 * boost factor 0.1411 = +0.0491, total strength now 2.8942
Base strength of team [10 24  2 30 23  5 11 28]: 3.9009
==================================================
Game # 90 evaluation: 
Base strength of team [30 16 11 12  6 21 10 26]: 3.1320
Skills pair boost for players (np.int64(30), np.int64(6)): avg strength 0.3328 * boost factor 0.1552 = +0.0516, total strength now 3.1836
Skills pair boost for players (np.int64(10), np.int64(12)): avg strength 0.4265 * boost factor 0.1377 = +0.0587, total strength now 3.2423
Base strength of team [13  5 29  4  1 23 22 14]: 2.3618
==================================================
Game # 91 evaluation: 
Base strength of team [10 15  5  9 16]: 1.3723
Favorite pair boost applied for players (np.int64(9), np.int64(10)): +0.1248, total strength now 1.4971
Base strength of team [27 21 12  7  1]: 3.3031
Favorite pair boost applied for players (np.int64(7), np.int64(21)): +0.1383, total strength now 3.4414
==================================================
Game # 92 evaluation: 
Base strength of team [20 10  4 11 27 30  6 19 18]: 3.6614
Favorite pair boost applied for players (np.int64(30), np.int64(20)): +0.0999, total strength now 3.7613
Skills pair boost for players (np.int64(30), np.int64(6)): avg strength 0.3328 * boost factor 0.1552 = +0.0516, total strength now 3.8130
Skills pair boost for players (np.int64(20), np.int64(6)): avg strength 0.3350 * boost factor 0.1165 = +0.0390, total strength now 3.8520
Skills pair boost for players (np.int64(20), np.int64(11)): avg strength 0.7909 * boost factor 0.1480 = +0.1171, total strength now 3.9691
Skills triplet boost for players (np.int64(10), np.int64(4), np.int64(20)): avg strength 0.2628 * boost factor 0.2501 = +0.0657, total strength now 4.0348
Base strength of team [12 16 25 29 13  5 22  9 17]: 3.8616
Skills pair boost for players (np.int64(16), np.int64(22)): avg strength 0.2982 * boost factor 0.1985 = +0.0592, total strength now 3.9208
==================================================
Game # 93 evaluation: 
Base strength of team [10  3  7 23  6 25]: 2.6950
Base strength of team [ 5 30 26 19  1  9]: 2.9132
==================================================
Game # 94 evaluation: 
Base strength of team [28  7 13 10 29]: 1.7380
Base strength of team [20 14 27 25 21]: 2.2326
Skills pair boost for players (np.int64(27), np.int64(14)): avg strength 0.3480 * boost factor 0.1411 = +0.0491, total strength now 2.2817
==================================================
Game # 95 evaluation: 
Base strength of team [ 8 30  6  7 16  2 18]: 3.6011
Favorite pair boost applied for players (np.int64(16), np.int64(8)): +0.1269, total strength now 3.7280
Favorite pair boost applied for players (np.int64(18), np.int64(8)): +0.0970, total strength now 3.8250
Skills pair boost for players (np.int64(30), np.int64(6)): avg strength 0.3328 * boost factor 0.1552 = +0.0516, total strength now 3.8766
Base strength of team [17 27  3 26 20 14  9]: 3.3391
Favorite pair boost applied for players (np.int64(17), np.int64(14)): +0.0780, total strength now 3.4170
Skills pair boost for players (np.int64(27), np.int64(14)): avg strength 0.3480 * boost factor 0.1411 = +0.0491, total strength now 3.4661
==================================================
Game # 96 evaluation: 
Base strength of team [25 23 21 24 17 28  1 10]: 3.8356
Base strength of team [ 8 11  6 27 20  2  4 19]: 3.9344
Skills pair boost for players (np.int64(20), np.int64(6)): avg strength 0.3350 * boost factor 0.1165 = +0.0390, total strength now 3.9735
Skills pair boost for players (np.int64(8), np.int64(11)): avg strength 0.7855 * boost factor 0.1778 = +0.1396, total strength now 4.1131
Skills pair boost for players (np.int64(20), np.int64(11)): avg strength 0.7909 * boost factor 0.1480 = +0.1171, total strength now 4.2302
==================================================
Game # 97 evaluation: 
Base strength of team [16 30  7 20 15 23  2 17]: 4.1963
Favorite pair boost applied for players (np.int64(30), np.int64(20)): +0.0999, total strength now 4.2963
Base strength of team [25 19  5 10 14 27 24  9]: 3.1132
Favorite pair boost applied for players (np.int64(9), np.int64(10)): +0.1248, total strength now 3.2380
Skills pair boost for players (np.int64(27), np.int64(14)): avg strength 0.3480 * boost factor 0.1411 = +0.0491, total strength now 3.2871
==================================================
Game # 98 evaluation: 
Base strength of team [ 7 21  9 26 18 13 29]: 2.6042
Favorite pair boost applied for players (np.int64(7), np.int64(21)): +0.1383, total strength now 2.7425
Base strength of team [ 2  5 23 24 14 10 30]: 2.5204
==================================================
Game # 99 evaluation: 
Base strength of team [ 2 19 10 17 22]: 1.8607
Skills pair boost for players (np.int64(10), np.int64(22)): avg strength 0.1564 * boost factor 0.1750 = +0.0274, total strength now 1.8881
Base strength of team [ 6  5 24 14 18]: 1.2839
==================================================
player_strengths shape: (31,)
teamA_data shape: (100, 9)
teamB_data shape: (100, 9)
outcomes shape: (100,)
Game 0:
 Team A players:  [ 4  8 19 24 23  3 26  7 25]
 Team A players' strengths:  [0.15601864 0.60111501 0.29122914 0.45606998 0.36636184 0.59865848
 0.19967378 0.86617615 0.78517596]
 Team B players:  [30 15 16 27 11 13 29 20 17]
Team B strengths 3.9747347450429076
 Team B players' strengths:  [0.60754485 0.18340451 0.30424224 0.51423444 0.96990985 0.21233911
 0.04645041 0.61185289 0.52475643]
 Label (Team A wins=1): 0.04070681695606775
Game 1:
 Team A players:  [16  9 21  4  5  1  8 17  6]
 Team A players' strengths:  [0.30424224 0.70807258 0.13949386 0.15601864 0.15599452 0.95071431
 0.60111501 0.52475643 0.05808361]
 Team B players:  [10 26 12 11 30 18 28  3  7]
Team B strengths 5.119349838794148
 Team B players' strengths:  [0.02058449 0.19967378 0.83244264 0.96990985 0.60754485 0.43194502
 0.59241457 0.59865848 0.86617615]
 Label (Team A wins=1): -1.2697718272631153
Game 2:
 Team A players:  [28 29 24  5 18  0  0  0  0]
 Team A players' strengths:  [0.59241457 0.04645041 0.45606998 0.15599452 0.43194502 0.37454012
 0.37454012 0.37454012 0.37454012]
 Team B players:  [27 19 14  1 11  0  0  0  0]
Team B strengths 4.4060731797801145
 Team B players' strengths:  [0.51423444 0.29122914 0.18182497 0.95071431 0.96990985 0.37454012
 0.37454012 0.37454012 0.37454012]
 Label (Team A wins=1): -1.181536314741586
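Note how the shorter teams in the dump above are right-padded with player id 0 up to the fixed width of 9 reported in `teamA_data shape: (100, 9)`. A minimal sketch of that padding step (the `pad_teams` helper name is ours, not part of `competition_manager`):

```python
import numpy as np

def pad_teams(teams, width=9, pad_id=0):
    """Right-pad each variable-length team with pad_id to a fixed width."""
    out = np.full((len(teams), width), pad_id, dtype=np.int64)
    for i, team in enumerate(teams):
        out[i, :len(team)] = team
    return out

# The two teams from Game 2 above, padded to width 9
padded = pad_teams([[28, 29, 24, 5, 18], [27, 19, 14, 1, 11]])
print(padded.shape)  # (2, 9)
```

Because 0 doubles as the pad token, real player ids must start at 1; this is why the embedding layer below uses `input_dim=NUM_PLAYERS + 1` and `mask_zero=True`.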

Building a model#

# Constants (adjust as needed)
PLAYER_EMB_DIM = 32

# Inputs: variable-length teams
teamA_input = Input(shape=(None,), dtype='int32', name='teamA')  # variable-length team
teamB_input = Input(shape=(None,), dtype='int32', name='teamB')  # variable-length team

# Embedding layer with mask support
player_embedding = layers.Embedding(
    input_dim=NUM_PLAYERS + 1,
    output_dim=PLAYER_EMB_DIM,
    embeddings_initializer=initializers.GlorotUniform(seed=seed_value),
    mask_zero=True,  # Important: enables automatic masking for padding (0 as pad token)
    # embeddings_regularizer=tf.keras.regularizers.l2(1e-4),
    name='player_embedding'
)

# Embed team players
teamA_embeds = player_embedding(teamA_input)  # shape: (batch, teamA_len, emb_dim)
teamB_embeds = player_embedding(teamB_input)

# Self-attention block (masks propagate automatically through the Functional API)
def self_attention_block(x, name_prefix=''):
    attn_output = layers.MultiHeadAttention(
        num_heads=4,
        key_dim=PLAYER_EMB_DIM,
        dropout=0.1,
        name=f'{name_prefix}_attn'
    )(x, x)
    x = layers.Add(name=f'{name_prefix}_residual')([x, attn_output])
    x = layers.LayerNormalization(name=f'{name_prefix}_norm')(x)
    return x

# Apply self-attention to each team's player embeddings
teamA_attn = self_attention_block(teamA_embeds, 'teamA')
teamB_attn = self_attention_block(teamB_embeds, 'teamB')

# Global average pooling over valid (non-padded) tokens
# TF handles masking automatically in GlobalAveragePooling1D if mask_zero=True
teamA_vector = layers.GlobalAveragePooling1D(name='teamA_avgpool')(teamA_attn)
teamB_vector = layers.GlobalAveragePooling1D(name='teamB_avgpool')(teamB_attn)

# Matchup modeling (difference vector)
matchup_vector = layers.Subtract(name='matchup_diff')([teamA_vector, teamB_vector])

# Concatenate summary representation
match_input = layers.Concatenate(name='match_features')([teamA_vector, teamB_vector, matchup_vector])

# Feedforward regression head (predicts the strength difference between the teams)
x = layers.Dense(64, activation='relu', kernel_regularizer=tf.keras.regularizers.l2(1e-4))(match_input)
x = layers.Dropout(0.3)(x)
x = layers.Dense(32, activation='relu', kernel_regularizer=tf.keras.regularizers.l2(1e-4))(x)
x = layers.Dropout(0.3)(x)
output = layers.Dense(1, activation='linear', name='regression_output')(x)

# Final model
model = Model(inputs=[teamA_input, teamB_input], outputs=output)
model.compile(optimizer='adam', 
                loss='mean_squared_error',   # or 'mean_absolute_error'
                metrics=['mean_absolute_error']
                )

model.summary()
2025-10-31 12:34:44.306459: E external/local_xla/xla/stream_executor/cuda/cuda_driver.cc:152] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
Model: "functional"
┏━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━┓
┃ Layer (type)         Output Shape          Param #  Connected to      ┃
┡━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━┩
│ teamA (InputLayer)  │ (None, None)      │          0 │ -                 │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamB (InputLayer)  │ (None, None)      │          0 │ -                 │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ player_embedding    │ (None, None, 32)  │        992 │ teamA[0][0],      │
│ (Embedding)         │                   │            │ teamB[0][0]       │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamA_attn          │ (None, None, 32)  │     16,800 │ player_embedding… │
│ (MultiHeadAttentio… │                   │            │ player_embedding… │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamB_attn          │ (None, None, 32)  │     16,800 │ player_embedding… │
│ (MultiHeadAttentio… │                   │            │ player_embedding… │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamA_residual      │ (None, None, 32)  │          0 │ player_embedding… │
│ (Add)               │                   │            │ teamA_attn[0][0]  │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ not_equal           │ (None, None)      │          0 │ teamA[0][0]       │
│ (NotEqual)          │                   │            │                   │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamB_residual      │ (None, None, 32)  │          0 │ player_embedding… │
│ (Add)               │                   │            │ teamB_attn[0][0]  │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ not_equal_1         │ (None, None)      │          0 │ teamB[0][0]       │
│ (NotEqual)          │                   │            │                   │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamA_norm          │ (None, None, 32)  │         64 │ teamA_residual[0… │
│ (LayerNormalizatio… │                   │            │                   │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ logical_or          │ (None, None)      │          0 │ not_equal[0][0],  │
│ (LogicalOr)         │                   │            │ not_equal[0][0]   │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamB_norm          │ (None, None, 32)  │         64 │ teamB_residual[0… │
│ (LayerNormalizatio… │                   │            │                   │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ logical_or_1        │ (None, None)      │          0 │ not_equal_1[0][0… │
│ (LogicalOr)         │                   │            │ not_equal_1[0][0] │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamA_avgpool       │ (None, 32)        │          0 │ teamA_norm[0][0], │
│ (GlobalAveragePool… │                   │            │ logical_or[0][0]  │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ teamB_avgpool       │ (None, 32)        │          0 │ teamB_norm[0][0], │
│ (GlobalAveragePool… │                   │            │ logical_or_1[0][ │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ matchup_diff        │ (None, 32)        │          0 │ teamA_avgpool[0]… │
│ (Subtract)          │                   │            │ teamB_avgpool[0]… │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ match_features      │ (None, 96)        │          0 │ teamA_avgpool[0]… │
│ (Concatenate)       │                   │            │ teamB_avgpool[0]… │
│                     │                   │            │ matchup_diff[0][ │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ dense (Dense)       │ (None, 64)        │      6,208 │ match_features[0… │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ dropout_2 (Dropout) │ (None, 64)        │          0 │ dense[0][0]       │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ dense_1 (Dense)     │ (None, 32)        │      2,080 │ dropout_2[0][0]   │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ dropout_3 (Dropout) │ (None, 32)        │          0 │ dense_1[0][0]     │
├─────────────────────┼───────────────────┼────────────┼───────────────────┤
│ regression_output   │ (None, 1)         │         33 │ dropout_3[0][0]   │
│ (Dense)             │                   │            │                   │
└─────────────────────┴───────────────────┴────────────┴───────────────────┘
 Total params: 43,041 (168.13 KB)
 Trainable params: 43,041 (168.13 KB)
 Non-trainable params: 0 (0.00 B)
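The `not_equal` and `logical_or` rows in the summary are the mask plumbing created by `mask_zero=True`: the pooling layers average only over real players, so a 5-a-side team is not diluted by its four pad slots. A numpy sketch of that masked average (`masked_average` is our illustrative name, not a Keras API):

```python
import numpy as np

def masked_average(embeds, ids, pad_id=0):
    """Average embedding rows over non-pad positions only,
    mirroring what GlobalAveragePooling1D does with the mask."""
    mask = ids != pad_id          # boolean mask, like mask_zero=True produces
    return embeds[mask].mean(axis=0)

ids = np.array([5, 7, 0, 0])      # two real players, two pad slots
embeds = np.array([[1.0, 2.0], [3.0, 4.0], [9.0, 9.0], [9.0, 9.0]])
print(masked_average(embeds, ids))  # [2. 3.] (pad rows are ignored)
```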

Training the model#

import numpy as np
from sklearn.model_selection import train_test_split

# Arrays from the dataset generation code above:
# teamA_data, teamB_data, outcomes (all np arrays)

# 1. Train-validation split (80% train, 20% validation)
X_trainA, X_valA, X_trainB, X_valB, y_train, y_val = train_test_split(
    teamA_data, teamB_data, outcomes, test_size=0.2, random_state=42
)

# 2. Reuse the variable-size team model built above ('model')

# 3. Optional: callbacks for monitoring
from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)

# Define a learning rate schedule function (step decay example)
def lr_schedule(epoch, lr):
    drop_rate = 0.5
    epochs_drop = 10
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

# Instantiate the schedule callback
lr_scheduler = tf.keras.callbacks.LearningRateScheduler(lr_schedule)

# Or adaptive reduction on plateau (reduce LR when val_loss stalls)
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5,
                                                 patience=5, min_lr=1e-6)


player_embedding_layer = model.get_layer("player_embedding")
embeddings_before_fold = player_embedding_layer.get_weights()[0]
print("Embedding vector for player zero before training:", embeddings_before_fold[0])

print(X_trainA)

# 4. Train the model
history = model.fit(
    [X_trainA, X_trainB],                 # Inputs as a list
    y_train,                              # Targets
    epochs=50,
    batch_size=32,
    validation_data=([X_valA, X_valB], y_val),
    callbacks=[lr_scheduler, reduce_lr, early_stop],
    # callbacks=[early_stop]
)

pe_layer = model.get_layer("player_embedding")
embeddings_after_fold = pe_layer.get_weights()[0]
print("Embedding vector for player zero after training:", embeddings_after_fold[0])

# Player index 0 is the padding id; if the model masks it out, its row
# receives no gradient, so we expect zero change here.
change = np.linalg.norm(embeddings_after_fold[0] - embeddings_before_fold[0])
print("Change in player zero embedding vector:", change)

# 5. Evaluate model performance (this is a regression model, so the
# second value returned is the mean absolute error, not accuracy)
loss, mae = model.evaluate([X_valA, X_valB], y_val)
print(f"Validation MAE: {mae:.3f}")
Embedding vector for player zero before training: [-0.13428518 -0.0009771   0.1329839  -0.29613775 -0.26009488  0.07811415
  0.15738839 -0.20170173 -0.22916353 -0.27609777  0.27313498 -0.16636868
 -0.01386166 -0.1564296   0.13861132  0.16607106  0.02278191 -0.20447966
  0.00173342 -0.18519947 -0.28682098 -0.11555447 -0.01580417  0.03690901
  0.1996915   0.05639273 -0.11093217  0.16203171 -0.23099054  0.06050453
 -0.12172711  0.21778926]
[[15 19 17 25 24  7 30 14  0]
 [20 18 25 13  5 10 15  2  0]
 [15 27 12 29 19  2 14  0  0]
 [13 29 18 24  4 30 16 20  0]
 [22 18 13 25  1 26 12  2  0]
 [ 7  8 21 28 25 13  0  0  0]
 [ 6 25 13  7  2 16  0  0  0]
 [25 23 21 24 17 28  1 10  0]
 [ 7 16 19 28 21 17  3  4  0]
 [30  8  7 16 12 13 10 19  0]
 [19 27 10 15 13 16  1  5 14]
 [ 2 18 12 21 25 26  0  0  0]
 [ 9  8 13 25 28 20  1 16  2]
 [ 5 27 19 21  9  0  0  0  0]
 [10  3  7 23  6 25  0  0  0]
 [16 22 17 14 21 30 29  7 27]
 [ 5 27  1 11 23  2 19 30  0]
 [18 15 16 11 12 25  9  8  1]
 [10  1  8 28 30 11 13 23  3]
 [ 3 15 21 10 25  0  0  0  0]
 [11 10 21 28  2  7  0  0  0]
 [11  1 17  7 21 20 10  0  0]
 [11 20 21  9 27  5 10  0  0]
 [ 8 30  6  7 16  2 18  0  0]
 [23 11  1 22 18 15 21 25 26]
 [24 16  4 23 15 13 20  0  0]
 [11  6 10 12  9  7 21 24  0]
 [20  7  4 14 26  9 21 30 10]
 [16  2 17 19 23 21 11  0  0]
 [ 9 23 13 25 11  0  0  0  0]
 [12 21  7  1 14  2 30 13  4]
 [26 10 12 17 14 20 18  0  0]
 [28 11 20 18 14  9  0  0  0]
 [16  7 18  8  3  1 10 22  0]
 [ 3  7 21 17 15 11  9  0  0]
 [11 16 15 29 10 21 27  0  0]
 [ 4  7 20 16 27  6 30 13  0]
 [23 24 27 12 28  8 29  3  0]
 [22 11 23 26 27 29  0  0  0]
 [12 19  4 14 15 26 18 27  0]
 [ 4 24  6 25  9 16  0  0  0]
 [ 2 19 10 17 22  0  0  0  0]
 [28 19  4 11  9 13  0  0  0]
 [19  8  1  7 27 25 10  2  0]
 [24 26  5 27 28  0  0  0  0]
 [ 2  9  6 26 23  4  0  0  0]
 [14 29  8 30 25 13  0  0  0]
 [ 8  3 29 18 20 13  0  0  0]
 [ 8 15 24 29 28  3  0  0  0]
 [16 30  7 20 15 23  2 17  0]
 [23 21  9 26  2 25 15  5  7]
 [12 17 10  8 18  0  0  0  0]
 [28  5 14 22 27  0  0  0  0]
 [29 11 15 10  9 23  4 13  0]
 [ 7 21  9 26 18 13 29  0  0]
 [21 18 17  4 14  3 24 26  0]
 [25  2 22  1  5 24 21 13 11]
 [29 12  3  5 28 21 11 24  0]
 [28  7 13 10 29  0  0  0  0]
 [16  2  5 19  4  0  0  0  0]
 [10  8 20  5 21  9  0  0  0]
 [14  8 30 16 11  0  0  0  0]
 [22 18  2  7  6 19 27 20 29]
 [ 7 10  8 13 25 27  1 18 19]
 [16  9 21  4  5  1  8 17  6]
 [ 7  1 29 11 10  0  0  0  0]
 [14 18 20 25  9  6 23  0  0]
 [28 29 24  5 18  0  0  0  0]
 [16 22 10 18 30 15  9  1  0]
 [21 13 20  1 16 14  9 12  0]
 [10 15  5  9 16  0  0  0  0]
 [19 18 11  5  8 28 10  0  0]
 [28 15  6 23  8 12 14 26  0]
 [21  8 17 16  1  0  0  0  0]
 [ 1 24 11 28  4  0  0  0  0]
 [22 10 14 16  4 26  0  0  0]
 [12 17 19  2 13 22 28 10 23]
 [21 13 28 18  7 23  9  0  0]
 [20 10  4 11 27 30  6 19 18]
 [26  8 21 12 23  5  1  0  0]]
Epoch 1/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 7s 4s/step - loss: 1.3557 - mean_absolute_error: 0.9303

3/3 ━━━━━━━━━━━━━━━━━━━━ 4s 200ms/step - loss: 1.2881 - mean_absolute_error: 0.9244 - val_loss: 1.1137 - val_mean_absolute_error: 0.8305 - learning_rate: 0.0010
Epoch 2/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 1.1848 - mean_absolute_error: 0.8136

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 1.2050 - mean_absolute_error: 0.8624 - val_loss: 1.0625 - val_mean_absolute_error: 0.8088 - learning_rate: 0.0010
Epoch 3/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 1.1106 - mean_absolute_error: 0.7927

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 1.0405 - mean_absolute_error: 0.7834 - val_loss: 1.0326 - val_mean_absolute_error: 0.7945 - learning_rate: 0.0010
Epoch 4/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.9526 - mean_absolute_error: 0.7229

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.9221 - mean_absolute_error: 0.7432 - val_loss: 1.0226 - val_mean_absolute_error: 0.7906 - learning_rate: 0.0010
Epoch 5/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.9425 - mean_absolute_error: 0.7056

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.9070 - mean_absolute_error: 0.7338 - val_loss: 1.0179 - val_mean_absolute_error: 0.7868 - learning_rate: 0.0010
Epoch 6/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.9145 - mean_absolute_error: 0.6780

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.8270 - mean_absolute_error: 0.7041 - val_loss: 0.9930 - val_mean_absolute_error: 0.7741 - learning_rate: 0.0010
Epoch 7/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.9592 - mean_absolute_error: 0.7422

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.8183 - mean_absolute_error: 0.6997 - val_loss: 0.9514 - val_mean_absolute_error: 0.7529 - learning_rate: 0.0010
Epoch 8/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.7137 - mean_absolute_error: 0.6123

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.6684 - mean_absolute_error: 0.6114 - val_loss: 0.9067 - val_mean_absolute_error: 0.7315 - learning_rate: 0.0010
Epoch 9/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.7740 - mean_absolute_error: 0.6560

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.6790 - mean_absolute_error: 0.6281 - val_loss: 0.8652 - val_mean_absolute_error: 0.7169 - learning_rate: 0.0010
Epoch 10/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.7091 - mean_absolute_error: 0.6217

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.6377 - mean_absolute_error: 0.5745 - val_loss: 0.8078 - val_mean_absolute_error: 0.6949 - learning_rate: 0.0010
Epoch 11/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.6153 - mean_absolute_error: 0.6081

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.5793 - mean_absolute_error: 0.5998 - val_loss: 0.7801 - val_mean_absolute_error: 0.6914 - learning_rate: 5.0000e-04
Epoch 12/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4765 - mean_absolute_error: 0.5098

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.5911 - mean_absolute_error: 0.5823 - val_loss: 0.7561 - val_mean_absolute_error: 0.6847 - learning_rate: 5.0000e-04
Epoch 13/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4296 - mean_absolute_error: 0.5430

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.5197 - mean_absolute_error: 0.5862 - val_loss: 0.7140 - val_mean_absolute_error: 0.6694 - learning_rate: 5.0000e-04
Epoch 14/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4422 - mean_absolute_error: 0.5066

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.4341 - mean_absolute_error: 0.5168 - val_loss: 0.6701 - val_mean_absolute_error: 0.6544 - learning_rate: 5.0000e-04
Epoch 15/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.3880 - mean_absolute_error: 0.5041

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3889 - mean_absolute_error: 0.4921 - val_loss: 0.6208 - val_mean_absolute_error: 0.6388 - learning_rate: 5.0000e-04
Epoch 16/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4060 - mean_absolute_error: 0.5013

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.3596 - mean_absolute_error: 0.4514 - val_loss: 0.5808 - val_mean_absolute_error: 0.6216 - learning_rate: 5.0000e-04
Epoch 17/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4670 - mean_absolute_error: 0.5333

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3866 - mean_absolute_error: 0.4711 - val_loss: 0.5546 - val_mean_absolute_error: 0.6167 - learning_rate: 5.0000e-04
Epoch 18/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.5359 - mean_absolute_error: 0.5487

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.4335 - mean_absolute_error: 0.4895 - val_loss: 0.5448 - val_mean_absolute_error: 0.6084 - learning_rate: 5.0000e-04
Epoch 19/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4063 - mean_absolute_error: 0.5224

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.4436 - mean_absolute_error: 0.5386 - val_loss: 0.5687 - val_mean_absolute_error: 0.6052 - learning_rate: 5.0000e-04
Epoch 20/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.5185 - mean_absolute_error: 0.5332

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.3953 - mean_absolute_error: 0.4660 - val_loss: 0.6201 - val_mean_absolute_error: 0.6180 - learning_rate: 5.0000e-04
Epoch 21/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4609 - mean_absolute_error: 0.5536

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.4088 - mean_absolute_error: 0.5102 - val_loss: 0.6131 - val_mean_absolute_error: 0.6152 - learning_rate: 2.5000e-04
Epoch 22/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4047 - mean_absolute_error: 0.5080

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.3802 - mean_absolute_error: 0.4924 - val_loss: 0.5773 - val_mean_absolute_error: 0.5989 - learning_rate: 2.5000e-04
Epoch 23/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.3631 - mean_absolute_error: 0.4250

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3643 - mean_absolute_error: 0.4578 - val_loss: 0.5427 - val_mean_absolute_error: 0.5896 - learning_rate: 2.5000e-04
Epoch 24/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4449 - mean_absolute_error: 0.5040

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3513 - mean_absolute_error: 0.4520 - val_loss: 0.5232 - val_mean_absolute_error: 0.5849 - learning_rate: 2.5000e-04
Epoch 25/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4191 - mean_absolute_error: 0.5346

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3592 - mean_absolute_error: 0.4837 - val_loss: 0.5134 - val_mean_absolute_error: 0.5788 - learning_rate: 2.5000e-04
Epoch 26/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.2815 - mean_absolute_error: 0.4228

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.2419 - mean_absolute_error: 0.3705 - val_loss: 0.5086 - val_mean_absolute_error: 0.5730 - learning_rate: 2.5000e-04
Epoch 27/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.5226 - mean_absolute_error: 0.4887

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.4344 - mean_absolute_error: 0.4765 - val_loss: 0.5054 - val_mean_absolute_error: 0.5689 - learning_rate: 2.5000e-04
Epoch 28/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.2873 - mean_absolute_error: 0.4129

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.2802 - mean_absolute_error: 0.4163 - val_loss: 0.5032 - val_mean_absolute_error: 0.5652 - learning_rate: 2.5000e-04
Epoch 29/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.3079 - mean_absolute_error: 0.4168

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.3106 - mean_absolute_error: 0.4271 - val_loss: 0.5033 - val_mean_absolute_error: 0.5625 - learning_rate: 2.5000e-04
Epoch 30/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.3105 - mean_absolute_error: 0.4512

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.2656 - mean_absolute_error: 0.3970 - val_loss: 0.4960 - val_mean_absolute_error: 0.5571 - learning_rate: 2.5000e-04
Epoch 31/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.4017 - mean_absolute_error: 0.4361

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3646 - mean_absolute_error: 0.4431 - val_loss: 0.4912 - val_mean_absolute_error: 0.5542 - learning_rate: 1.2500e-04
Epoch 32/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.3000 - mean_absolute_error: 0.4050

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.3913 - mean_absolute_error: 0.4575 - val_loss: 0.4893 - val_mean_absolute_error: 0.5513 - learning_rate: 1.2500e-04
Epoch 33/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4499 - mean_absolute_error: 0.5023

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3442 - mean_absolute_error: 0.4327 - val_loss: 0.4882 - val_mean_absolute_error: 0.5488 - learning_rate: 1.2500e-04
Epoch 34/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4591 - mean_absolute_error: 0.5302

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3274 - mean_absolute_error: 0.4340 - val_loss: 0.4857 - val_mean_absolute_error: 0.5462 - learning_rate: 1.2500e-04
Epoch 35/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.3753 - mean_absolute_error: 0.4626

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.2929 - mean_absolute_error: 0.3923 - val_loss: 0.4824 - val_mean_absolute_error: 0.5435 - learning_rate: 1.2500e-04
Epoch 36/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.4467 - mean_absolute_error: 0.5184

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3293 - mean_absolute_error: 0.4358 - val_loss: 0.4779 - val_mean_absolute_error: 0.5410 - learning_rate: 1.2500e-04
Epoch 37/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.3202 - mean_absolute_error: 0.4359

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.3144 - mean_absolute_error: 0.4290 - val_loss: 0.4734 - val_mean_absolute_error: 0.5382 - learning_rate: 1.2500e-04
Epoch 38/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.2336 - mean_absolute_error: 0.4206

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.2446 - mean_absolute_error: 0.3943 - val_loss: 0.4674 - val_mean_absolute_error: 0.5351 - learning_rate: 1.2500e-04
Epoch 39/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.2624 - mean_absolute_error: 0.4166

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.3144 - mean_absolute_error: 0.4285 - val_loss: 0.4611 - val_mean_absolute_error: 0.5318 - learning_rate: 1.2500e-04
Epoch 40/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.2081 - mean_absolute_error: 0.3497

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step - loss: 0.2467 - mean_absolute_error: 0.3769 - val_loss: 0.4557 - val_mean_absolute_error: 0.5286 - learning_rate: 1.2500e-04
Epoch 41/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.3156 - mean_absolute_error: 0.4012

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - loss: 0.3243 - mean_absolute_error: 0.4253 - val_loss: 0.4541 - val_mean_absolute_error: 0.5274 - learning_rate: 6.2500e-05
Epoch 42/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.2919 - mean_absolute_error: 0.4238

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 29ms/step - loss: 0.3039 - mean_absolute_error: 0.4166 - val_loss: 0.4531 - val_mean_absolute_error: 0.5262 - learning_rate: 6.2500e-05
Epoch 43/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1735 - mean_absolute_error: 0.3326

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step - loss: 0.2125 - mean_absolute_error: 0.3519 - val_loss: 0.4529 - val_mean_absolute_error: 0.5252 - learning_rate: 6.2500e-05
Epoch 44/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.2632 - mean_absolute_error: 0.3957

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.2608 - mean_absolute_error: 0.3910 - val_loss: 0.4544 - val_mean_absolute_error: 0.5245 - learning_rate: 6.2500e-05
Epoch 45/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.2503 - mean_absolute_error: 0.3682

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.2653 - mean_absolute_error: 0.3655 - val_loss: 0.4546 - val_mean_absolute_error: 0.5236 - learning_rate: 6.2500e-05
Epoch 46/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.2527 - mean_absolute_error: 0.3469

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.2824 - mean_absolute_error: 0.3900 - val_loss: 0.4549 - val_mean_absolute_error: 0.5228 - learning_rate: 6.2500e-05
Epoch 47/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.2627 - mean_absolute_error: 0.4265

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.2960 - mean_absolute_error: 0.4470 - val_loss: 0.4546 - val_mean_absolute_error: 0.5221 - learning_rate: 6.2500e-05
Epoch 48/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.2362 - mean_absolute_error: 0.4126

3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.2573 - mean_absolute_error: 0.3974 - val_loss: 0.4557 - val_mean_absolute_error: 0.5216 - learning_rate: 6.2500e-05
Embedding vector for player zero after training: [-0.13428518 -0.0009771   0.1329839  -0.29613775 -0.26009488  0.07811415
  0.15738839 -0.20170173 -0.22916353 -0.27609777  0.27313498 -0.16636868
 -0.01386166 -0.1564296   0.13861132  0.16607106  0.02278191 -0.20447966
  0.00173342 -0.18519947 -0.28682098 -0.11555447 -0.01580417  0.03690901
  0.1996915   0.05639273 -0.11093217  0.16203171 -0.23099054  0.06050453
 -0.12172711  0.21778926]
Change in player zero embedding vector: 0.0
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.4529 - mean_absolute_error: 0.5252

1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 34ms/step - loss: 0.4529 - mean_absolute_error: 0.5252
Validation MAE: 0.525

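The zero change in player zero's embedding is expected rather than a bug: index 0 is the padding id, so under masked pooling a padded slot contributes nothing to the team representation, and its embedding row receives no gradient. A minimal NumPy sketch of that masked mean, using a random stand-in for the trained table:

```python
import numpy as np

rng = np.random.default_rng(0)
emb = rng.normal(size=(31, 32))  # stand-in embedding table (NUM_PLAYERS x EMB_DIM)

team = np.array([15, 19, 17, 25, 24, 7, 30, 14, 0])  # last slot is padding (id 0)
mask = (team != 0).astype(float)

# Padding rows are multiplied by zero, so they neither influence the
# team vector nor receive any gradient through it.
vectors = emb[team] * mask[:, None]
team_vector = vectors.sum(axis=0) / mask.sum()

print(team_vector.shape)  # (32,)
```

The masked mean here equals the plain mean over the eight real players, which is exactly why the padding row can stay frozen at its initial value throughout training.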
import matplotlib.pyplot as plt

# Assume 'history' is the object returned by your call to model.fit(...)

# Plot training and validation accuracy
plt.figure(figsize=(14, 5))

plt.subplot(1, 2, 1)
plt.plot(history.history['mean_absolute_error'], label='Training MAE')
plt.plot(history.history['val_mean_absolute_error'], label='Validation MAE')
plt.xlabel('Epoch')
plt.ylabel('mean_absolute_error')
plt.title('Training vs Validation MAE')
plt.legend()

# Plot training and validation loss
plt.subplot(1, 2, 2)
plt.plot(history.history['loss'], label='Training Loss')
plt.plot(history.history['val_loss'], label='Validation Loss')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.title('Training vs Validation Loss')
plt.legend()

plt.tight_layout()
plt.show()
_images/1e5ab52e710e8a83c0ad87ee234175e079404ee26dd22820acf0fe3c945bc5e0.png
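The curves can also be summarised numerically: the epoch worth reporting is simply the argmin of the validation loss in `history.history`. A sketch with illustrative values standing in for the real history:

```python
# Illustrative stand-in for history.history["val_loss"]; a real run
# would take the list straight from the model.fit(...) call above.
val_loss = [1.11, 1.06, 1.03, 0.51, 0.45, 0.46]

best_epoch = min(range(len(val_loss)), key=val_loss.__getitem__)
print(f"Best epoch (0-based): {best_epoch}, val_loss = {val_loss[best_epoch]}")
```

With `restore_best_weights=True` in the `EarlyStopping` callback, the weights from this epoch are the ones the model ends up with.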

Extracting embeddings of players#

# Direct reference to the embedding layer object created earlier
player_embeddings = player_embedding.get_weights()[0]
print(player_embeddings.shape)  # (NUM_PLAYERS, PLAYER_EMB_DIM)
(31, 32)
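Before projecting anything, the raw (31, 32) table can be sanity-checked directly with pairwise cosine similarity, which shows which players the model treats as similar. A sketch on a random stand-in of the same shape (real code would pass `player_embeddings`):

```python
import numpy as np

rng = np.random.default_rng(0)
table = rng.normal(size=(31, 32))  # stand-in for player_embeddings

# Row-normalise, then a single matrix product yields all pairwise cosines.
unit = table / np.linalg.norm(table, axis=1, keepdims=True)
similarity = unit @ unit.T

print(similarity.shape)  # (31, 31)
```

Unlike the UMAP view below, this matrix is exact rather than a projection, so it is a useful cross-check on any cluster the 3D plot appears to show.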

UMAP projection and visualization of players in 3D#

import numpy as np
import umap
import matplotlib.pyplot as plt
import plotly.graph_objs as go

# The following two lines are needed to make the Plotly chart render in the book
import plotly.io as pio
pio.renderers.default = 'notebook'



def umap_and_visualize(player_embeddings, player_strengths, marker_labels=None, random_state=None):

    # Embed to 3D
    reducer = umap.UMAP(n_components=3, random_state=random_state, n_jobs=-1)
    embeddings_3d = reducer.fit_transform(player_embeddings)

    labels_points = marker_labels if marker_labels else [f'Player {idx}: {strength}' for idx, strength in enumerate(player_strengths)]

    # Create interactive 3D scatter plot
    fig = go.Figure(
        data=[go.Scatter3d(
            x=embeddings_3d[:, 0],
            y=embeddings_3d[:, 1],
            z=embeddings_3d[:, 2],
            mode='markers',
            marker=dict(
                size=7,
                color=player_strengths,       # Color by this array
                colorscale='Viridis',         # Choose a colorscale
                colorbar=dict(title='Strength'),
                opacity=0.8
            ),
            text=labels_points,          # Hover labels
            hoverinfo='text'
        )]
    )

    fig.update_layout(
        title="3D UMAP projection of player embeddings",
        width=1000,             # <-- Change this to your desired width in pixels
        height=800,  
        scene=dict(
            xaxis_title="UMAP-1",
            yaxis_title="UMAP-2",
            zaxis_title="UMAP-3"
        )
    )
    return fig, embeddings_3d

3D visualization of players’ projected embeddings#

fig, embeddings_3d = umap_and_visualize(player_embeddings, player_strengths)
fig

Calculate correlation of the embeddings with the original base strengths#

import numpy as np
from scipy.stats import pearsonr

def compute_correlations_for_projected_dims(embeddings_nd, player_strengths):

    """
    Compute the Pearson correlation coefficients between each embedding dimension and player strengths.

    Parameters
    ----------
    embeddings_nd : numpy.ndarray
        A 2D array of shape (num_players, n_dims) containing projection coordinates (e.g., from UMAP).
    player_strengths : numpy.ndarray
        A 1D array of shape (num_players,) containing the strength value of each player.

    Returns
    -------
    avg_abs_corr : float
        The average of the absolute Pearson correlation coefficients across all embedding dimensions.

    Notes
    -----
    - Includes player zero, which is used for padding.
    """

    correlations = []
    print(f"player_strengths.shape: {player_strengths.shape}" )
    print(f"embeddings_nd[:, 0].shape : {embeddings_nd[:, 0].shape}" )
    print(f"Embeddings shape: {embeddings_nd.shape}")
    n_dims = embeddings_nd.shape[1]
    
    for dim in range(n_dims):
        corr, p_value = pearsonr(embeddings_nd[:, dim], player_strengths)
        correlations.append((corr, p_value))
        print(f"Dimension {dim + 1} correlation with base strengths: r = {corr:.4f}, p-value = {p_value:.4g}")

    # compute average absolute correlation across all dimensions
    avg_abs_corr = np.mean([abs(c[0]) for c in correlations])
    print(f"Average absolute correlation across {n_dims} components: {avg_abs_corr:.4f}")
    
    return avg_abs_corr

avg_abs_corr = compute_correlations_for_projected_dims(embeddings_3d, player_strengths)
player_strengths.shape: (31,)
embeddings_nd[:, 0].shape : (31,)
Embeddings shape: (31, 3)
Dimension 1 correlation with base strengths: r = -0.0091, p-value = 0.9613
Dimension 2 correlation with base strengths: r = -0.1202, p-value = 0.5194
Dimension 3 correlation with base strengths: r = -0.2317, p-value = 0.2097
Average absolute correlation across 3 components: 0.1203
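A caveat on this check: UMAP is non-linear, so weak per-axis Pearson correlations in the projection do not rule out a strength signal in the raw embeddings. A complementary linear probe is to correlate strengths with the first principal component of the unprojected table; a sketch on synthetic data where strength does drive the embeddings (real code would pass `player_embeddings` and `player_strengths`):

```python
import numpy as np

rng = np.random.default_rng(1)
n_players, dim = 31, 32
strengths = rng.uniform(1, 5, size=n_players)           # synthetic strengths
embeddings = np.outer(strengths, rng.normal(size=dim))  # strength-driven signal...
embeddings += 0.1 * rng.normal(size=(n_players, dim))   # ...plus a little noise

# First principal component via SVD of the centred matrix.
centred = embeddings - embeddings.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
pc1 = centred @ vt[0]

r = np.corrcoef(pc1, strengths)[0, 1]
print(f"|r| between PC1 and strength: {abs(r):.3f}")
```

On this synthetic table the correlation is close to 1; running the same probe on the real embeddings would tell us whether the low UMAP correlations reflect the embeddings or only the projection.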

Intermediate conclusions#

We observed that the model is able to learn meaningful embeddings, but we used far more data points than we can realistically collect.

Let’s keep modelling, trying to keep the number of games low while making better use of the data and searching for a better model (hyperparameter search).
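The next section relies on a hyperparameter search (the `tuner` object, built elsewhere in the chapter with Keras Tuner). The idea can be sketched in pure Python as random sampling from a grid; the ranges below are hypothetical, not the ones actually tuned:

```python
import random

# Hypothetical search space; the real one lives in the Keras Tuner setup.
search_space = {
    "attention_heads": [1, 2, 4],
    "attention_dropout_rate": [0.0, 0.1, 0.2],
    "player_emb_dim": [16, 32, 64],
    "learning_rate": [1e-3, 3e-3, 1e-2],
}

random.seed(0)
trials = [
    {name: random.choice(values) for name, values in search_space.items()}
    for _ in range(5)
]
for hp in trials:
    print(hp)  # each trial would build and cross-validate one model
```

Random search tends to beat exhaustive grid search at a fixed budget when only a few hyperparameters matter, which is likely our situation given how small the dataset is.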

Tuning the attention layer#

Training the best model after searching for attention hyperparameters#

print(f"HPS: {best_avg_hp_atten.values}. Avg MSE: {best_avg_hp_val_score_atten}.")
model_atten = tuner.hypermodel.build(best_avg_hp_atten)

X_train = [teamA_data, teamB_data]
y_train = [outcomes]

history = model_atten.fit(X_train, y_train, epochs=50)

# Access the final training loss (no validation split is used here)
train_loss = history.history['loss'][-1]
HPS: {'attention_heads': 1, 'attention_dropout_rate': 0.2, 'player_emb_dim': 32, 'dense_units': 64, 'dense_units_2': 80, 'learning_rate': 0.01, 'dropout_rate': 0.30000000000000004}. Avg MSE: 0.1817317560315132.
Epoch 1/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - loss: 1.3455 - mean_absolute_error: 0.9445
Epoch 2/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 1.0117 - mean_absolute_error: 0.8138 
Epoch 3/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.7855 - mean_absolute_error: 0.7535 
Epoch 4/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.5023 - mean_absolute_error: 0.5457 
Epoch 5/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3850 - mean_absolute_error: 0.4815 
Epoch 6/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.4245 - mean_absolute_error: 0.4990 
Epoch 7/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2923 - mean_absolute_error: 0.4029 
Epoch 8/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3692 - mean_absolute_error: 0.4665 
Epoch 9/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3095 - mean_absolute_error: 0.4085 
Epoch 10/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.4343 - mean_absolute_error: 0.4690 
Epoch 11/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2921 - mean_absolute_error: 0.4161 
Epoch 12/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2327 - mean_absolute_error: 0.3667 
Epoch 13/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2730 - mean_absolute_error: 0.3825 
Epoch 14/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.4223 - mean_absolute_error: 0.5284 
Epoch 15/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2238 - mean_absolute_error: 0.3585 
Epoch 16/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1806 - mean_absolute_error: 0.3135 
Epoch 17/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2229 - mean_absolute_error: 0.3521 
Epoch 18/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1644 - mean_absolute_error: 0.2804 
Epoch 19/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2644 - mean_absolute_error: 0.3560 
Epoch 20/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1198 - mean_absolute_error: 0.2362 
Epoch 21/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1569 - mean_absolute_error: 0.2927 
Epoch 22/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2277 - mean_absolute_error: 0.3071 
Epoch 23/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2128 - mean_absolute_error: 0.3382 
Epoch 24/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1618 - mean_absolute_error: 0.2938 
Epoch 25/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2411 - mean_absolute_error: 0.3649 
Epoch 26/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1965 - mean_absolute_error: 0.3228 
Epoch 27/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1612 - mean_absolute_error: 0.2829 
Epoch 28/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1554 - mean_absolute_error: 0.2799 
Epoch 29/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2556 - mean_absolute_error: 0.3685 
Epoch 30/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1772 - mean_absolute_error: 0.2805 
Epoch 31/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2551 - mean_absolute_error: 0.3758 
Epoch 32/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1584 - mean_absolute_error: 0.2942 
Epoch 33/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2003 - mean_absolute_error: 0.3258 
Epoch 34/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1838 - mean_absolute_error: 0.3105 
Epoch 35/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1673 - mean_absolute_error: 0.2909 
Epoch 36/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1710 - mean_absolute_error: 0.2996 
Epoch 37/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1705 - mean_absolute_error: 0.2993 
Epoch 38/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1536 - mean_absolute_error: 0.3109 
Epoch 39/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2502 - mean_absolute_error: 0.3649 
Epoch 40/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1749 - mean_absolute_error: 0.3114 
Epoch 41/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2008 - mean_absolute_error: 0.3261 
Epoch 42/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2374 - mean_absolute_error: 0.3402 
Epoch 43/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1570 - mean_absolute_error: 0.2721 
Epoch 44/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2205 - mean_absolute_error: 0.3196 
Epoch 45/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1755 - mean_absolute_error: 0.2726 
Epoch 46/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2009 - mean_absolute_error: 0.3135 
Epoch 47/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1465 - mean_absolute_error: 0.2687 
Epoch 48/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1376 - mean_absolute_error: 0.2609 
Epoch 49/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1363 - mean_absolute_error: 0.2559 
Epoch 50/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1136 - mean_absolute_error: 0.2279 

Analysing the embeddings of the tuned attention model#

player_embedding_layer_atten = model_atten.get_layer("player_embedding")
embeddings_best_kfold_atten = player_embedding_layer_atten.get_weights()[0]
embeddings_best_kfold_atten
array([[-1.34285182e-01, -9.77098942e-04,  1.32983893e-01,
        -2.96137750e-01, -2.60094881e-01,  7.81141520e-02,
         1.57388389e-01, -2.01701730e-01, -2.29163527e-01,
        -2.76097775e-01,  2.73134977e-01, -1.66368678e-01,
        -1.38616562e-02, -1.56429604e-01,  1.38611317e-01,
         1.66071057e-01,  2.27819085e-02, -2.04479665e-01,
         1.73342228e-03, -1.85199469e-01, -2.86820978e-01,
        -1.15554467e-01, -1.58041716e-02,  3.69090140e-02,
         1.99691504e-01,  5.63927293e-02, -1.10932171e-01,
         1.62031710e-01, -2.30990544e-01,  6.05045259e-02,
        -1.21727109e-01,  2.17789263e-01],
       [ 3.85632515e-01,  2.39664298e-02,  9.98795703e-02,
        ...]], dtype=float32)
fig, embeddings_atten_3d  = umap_and_visualize(embeddings_best_kfold_atten, player_strengths)
fig

best_kfold_avg_abs_corr_atten = compute_correlations_for_projected_dims(embeddings_atten_3d, player_strengths)
player_strengths.shape: (31,)
embeddings_nd[:, 0].shape : (31,)
Embeddings shape: (31, 3)
Dimension 1 correlation with base strengths: r = 0.9013, p-value = 4.686e-12
Dimension 2 correlation with base strengths: r = 0.4950, p-value = 0.004638
Dimension 3 correlation with base strengths: r = -0.4726, p-value = 0.007261
Average absolute correlation across 3 components: 0.6230
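The `compute_correlations_for_projected_dims` helper lives in `competition_manager`; a minimal sketch of the same idea, computing the Pearson correlation of each projected dimension against the base strengths and averaging the absolute values (the random data and function name here are purely illustrative, not the notebook's actual implementation):

```python
import numpy as np

def avg_abs_correlation(embeddings_nd, strengths):
    """Correlate each projected dimension with base strengths (Pearson r)."""
    rs = []
    for d in range(embeddings_nd.shape[1]):
        r = np.corrcoef(embeddings_nd[:, d], strengths)[0, 1]
        rs.append(r)
        print(f"Dimension {d + 1} correlation with base strengths: r = {r:.4f}")
    return float(np.mean(np.abs(rs)))

# Illustrative data: 31 players, 3 projected dimensions.
rng = np.random.default_rng(0)
strengths = rng.normal(size=31)
emb = np.column_stack([strengths + 0.1 * rng.normal(size=31),  # strongly aligned
                       rng.normal(size=31),                    # unrelated
                       -strengths + rng.normal(size=31)])      # anti-aligned
score = avg_abs_correlation(emb, strengths)
```

Taking the absolute value means a strongly anti-correlated dimension (like dimension 3 above, or r = -0.4726 in the real output) still counts as carrying strength information.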

Intermediate conclusions#

By searching over the attention layer's hyperparameters we improved our embedding score from 0.6488 to 0.6901, which is a significant improvement!

Early stopping and other overfitting mitigation techniques#

Defining the model and adding callbacks with early stopping#

We don’t need to change the model specification itself; callbacks can be passed in through the Keras tuner.

from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)

# Define a learning rate schedule function (step decay example)
def lr_schedule(epoch, lr):
    drop_rate = 0.5
    epochs_drop = 10
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

# Instantiate the learning rate scheduler callback
lr_scheduler = tf.keras.callbacks.LearningRateScheduler(lr_schedule)

# Or adaptive reduction on plateau (reduce LR when val_loss stalls)
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=5, min_lr=1e-6)

es_callbacks=[lr_scheduler, reduce_lr, early_stop]
# es_callbacks=[early_stop]
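As a quick sanity check (illustrative only, not part of the notebook run), the step-decay schedule defined above keeps the learning rate flat for ten epochs and then halves it, because Keras’ `LearningRateScheduler` feeds each epoch’s current rate back into the function:

```python
def lr_schedule(epoch, lr):
    # Same step decay as above: halve the learning rate every 10 epochs.
    drop_rate = 0.5
    epochs_drop = 10
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

lr = 1e-3
history = []
for epoch in range(25):
    lr = lr_schedule(epoch, lr)  # mimics what LearningRateScheduler does per epoch
    history.append(lr)

# Epochs 0-9 stay at 1e-3, epochs 10-19 at 5e-4, epochs 20-24 at 2.5e-4.
```

Note the `epoch > 0` guard: without it, `epoch % epochs_drop == 0` would also fire on epoch 0 and halve the initial learning rate before training starts.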


all_best_hps_es = hyperparameter_search(build_model_cv_atten, callbacks=es_callbacks)
teamA_data shape: (100, 9)
teamB_data shape: (100, 9)
outcomes shape: (100,)

FOLD 1
2025-08-09 17:01:20.137738: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
/opt/anaconda3/envs/footballman/lib/python3.12/site-packages/keras/src/saving/saving_lib.py:757: UserWarning:

Skipping variable loading for optimizer 'adam', because it has 2 variables whereas the saved optimizer has 56 variables. 
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 242ms/step - loss: 0.3075 - mean_absolute_error: 0.4485
1: 0.30745211243629456     2: 0.4485432505607605  
0.30745211243629456

FOLD 2
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 232ms/step - loss: 0.2053 - mean_absolute_error: 0.3278
1: 0.20530542731285095     2: 0.32781749963760376  
0.20530542731285095

FOLD 3
2025-08-09 17:02:25.784884: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
1/1 ━━━━━━━━━━━━━━━━━━━━ 1s 751ms/step - loss: 0.2616 - mean_absolute_error: 0.4141
1: 0.261588990688324     2: 0.41407984495162964  
0.261588990688324

FOLD 4
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 245ms/step - loss: 0.2157 - mean_absolute_error: 0.3561
1: 0.21570256352424622     2: 0.3561144471168518  
0.21570256352424622

FOLD 5
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 319ms/step - loss: 0.1392 - mean_absolute_error: 0.2918
1: 0.13918931782245636     2: 0.2917974293231964  
0.13918931782245636

Best hyperparameters found:
player_emb_dim: 32
dense_units: 16
dense_units_2: 112
learning_rate: 0.001
dropout_rate: 0.2
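The winning configuration is chosen by averaging each candidate's validation loss across the folds and taking the minimum. A minimal sketch of that selection step, using a hypothetical helper and illustrative fold losses (mirroring the magnitudes printed above, not the chapter's actual helpers or data):

```python
def select_best_avg_hps(avg_hps_losses):
    """Pick the hyperparameter set with the lowest mean per-fold val loss.

    avg_hps_losses maps a candidate's id to its list of per-fold losses.
    """
    # Mean validation loss per candidate across all folds.
    scores = {hp: sum(losses) / len(losses) for hp, losses in avg_hps_losses.items()}
    best_hp = min(scores, key=scores.get)
    return best_hp, scores[best_hp]

# Illustrative per-fold MSEs for two hypothetical candidates.
fold_losses = {
    "hp_a": [0.2616, 0.2157, 0.1392],
    "hp_b": [0.3100, 0.2900, 0.2800],
}
best_hp, best_score = select_best_avg_hps(fold_losses)  # "hp_a" wins on average
```

Averaging over folds rather than trusting a single split matters here: with around 20 recorded games, any one validation fold is tiny and noisy.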
def train_best_hps_model(best_hps):
    """Re-evaluate the candidate hyperparameter sets across all folds,
    pick the one with the best average validation loss and retrain it
    on the full training data."""
    # Average validation performance of each candidate across the k folds.
    avg_hps_losses = evaluate_avg_hps_performance(best_hps)

    # Candidate with the lowest average validation MSE.
    best_avg_hp, best_avg_hp_val_score = selecting_best_avg_hps(avg_hps_losses)

    print(f"HPS: {best_avg_hp.values}. Avg MSE: {best_avg_hp_val_score}.")
    model_hp = tuner.hypermodel.build(best_avg_hp)

    # Retrain the freshly built model on all available data:
    # the two team line-ups as inputs, the match outcomes as targets.
    X_train = [teamA_data, teamB_data]
    y_train = [outcomes]

    history = model_hp.fit(X_train, y_train, epochs=50)

    train_loss = history.history['loss'][-1]
    return model_hp, train_loss
model_hp, model_hp_train_loss = train_best_hps_model(all_best_hps_es)
HPS: {'player_emb_dim': 32, 'dense_units': 16, 'dense_units_2': 112, 'learning_rate': 0.001, 'dropout_rate': 0.2}. MSE during RandomSearch: 0.30745211243629456. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 103ms/step - loss: 1.0038 - mean_absolute_error: 0.7957 - val_loss: 1.1184 - val_mean_absolute_error: 0.8571
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.9078 - mean_absolute_error: 0.7587 - val_loss: 1.1480 - val_mean_absolute_error: 0.8713
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.8662 - mean_absolute_error: 0.7512 - val_loss: 1.1693 - val_mean_absolute_error: 0.8818
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.8122 - mean_absolute_error: 0.7133 - val_loss: 1.1545 - val_mean_absolute_error: 0.8794
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.6923 - mean_absolute_error: 0.6570 - val_loss: 1.1224 - val_mean_absolute_error: 0.8611
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.6514 - mean_absolute_error: 0.6538 - val_loss: 0.9974 - val_mean_absolute_error: 0.7915
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.5943 - mean_absolute_error: 0.6172 - val_loss: 0.9146 - val_mean_absolute_error: 0.7459
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.5195 - mean_absolute_error: 0.5731 - val_loss: 0.8880 - val_mean_absolute_error: 0.7105
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3507 - mean_absolute_error: 0.4916 - val_loss: 0.8924 - val_mean_absolute_error: 0.7063
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3727 - mean_absolute_error: 0.5131 - val_loss: 0.8047 - val_mean_absolute_error: 0.6685
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.4566 - mean_absolute_error: 0.5370 - val_loss: 0.7642 - val_mean_absolute_error: 0.6572
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3431 - mean_absolute_error: 0.4721 - val_loss: 0.6786 - val_mean_absolute_error: 0.6324
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3399 - mean_absolute_error: 0.4517 - val_loss: 0.6583 - val_mean_absolute_error: 0.6215
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3188 - mean_absolute_error: 0.4570 - val_loss: 0.6686 - val_mean_absolute_error: 0.6220
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2118 - mean_absolute_error: 0.3640 - val_loss: 0.6346 - val_mean_absolute_error: 0.6038
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2466 - mean_absolute_error: 0.4056 - val_loss: 0.5816 - val_mean_absolute_error: 0.5828
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2549 - mean_absolute_error: 0.3899 - val_loss: 0.5659 - val_mean_absolute_error: 0.5785
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1675 - mean_absolute_error: 0.3211 - val_loss: 0.5937 - val_mean_absolute_error: 0.5816
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1738 - mean_absolute_error: 0.3292 - val_loss: 0.5782 - val_mean_absolute_error: 0.5718
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2332 - mean_absolute_error: 0.3738 - val_loss: 0.5176 - val_mean_absolute_error: 0.5541
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2051 - mean_absolute_error: 0.3446 - val_loss: 0.5134 - val_mean_absolute_error: 0.5602
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1984 - mean_absolute_error: 0.3192 - val_loss: 0.5285 - val_mean_absolute_error: 0.5715
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2231 - mean_absolute_error: 0.3465 - val_loss: 0.4861 - val_mean_absolute_error: 0.5532
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1341 - mean_absolute_error: 0.2778 - val_loss: 0.4270 - val_mean_absolute_error: 0.5168
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1832 - mean_absolute_error: 0.2992 - val_loss: 0.4071 - val_mean_absolute_error: 0.4994
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 50ms/step - loss: 0.1649 - mean_absolute_error: 0.3124 - val_loss: 0.4200 - val_mean_absolute_error: 0.4947
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.1477 - mean_absolute_error: 0.2947 - val_loss: 0.4174 - val_mean_absolute_error: 0.4915
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1280 - mean_absolute_error: 0.2586 - val_loss: 0.4111 - val_mean_absolute_error: 0.4837
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1154 - mean_absolute_error: 0.2694 - val_loss: 0.3893 - val_mean_absolute_error: 0.4706
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1679 - mean_absolute_error: 0.3136 - val_loss: 0.3834 - val_mean_absolute_error: 0.4677
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1480 - mean_absolute_error: 0.2902 - val_loss: 0.3784 - val_mean_absolute_error: 0.4748
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1726 - mean_absolute_error: 0.3032 - val_loss: 0.3674 - val_mean_absolute_error: 0.4856
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1501 - mean_absolute_error: 0.2864 - val_loss: 0.3744 - val_mean_absolute_error: 0.4836
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1442 - mean_absolute_error: 0.2733 - val_loss: 0.3651 - val_mean_absolute_error: 0.4681
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1302 - mean_absolute_error: 0.2670 - val_loss: 0.3364 - val_mean_absolute_error: 0.4422
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0961 - mean_absolute_error: 0.2278 - val_loss: 0.3177 - val_mean_absolute_error: 0.4343
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0808 - mean_absolute_error: 0.2153 - val_loss: 0.3015 - val_mean_absolute_error: 0.4377
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1395 - mean_absolute_error: 0.2639 - val_loss: 0.2924 - val_mean_absolute_error: 0.4335
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1077 - mean_absolute_error: 0.2490 - val_loss: 0.2942 - val_mean_absolute_error: 0.4262
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1209 - mean_absolute_error: 0.2559 - val_loss: 0.3072 - val_mean_absolute_error: 0.4210
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0898 - mean_absolute_error: 0.2380 - val_loss: 0.3024 - val_mean_absolute_error: 0.4117
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1003 - mean_absolute_error: 0.2280 - val_loss: 0.2663 - val_mean_absolute_error: 0.3987
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1097 - mean_absolute_error: 0.2252 - val_loss: 0.2544 - val_mean_absolute_error: 0.4023
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1270 - mean_absolute_error: 0.2526 - val_loss: 0.2523 - val_mean_absolute_error: 0.4021
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1147 - mean_absolute_error: 0.2286 - val_loss: 0.2584 - val_mean_absolute_error: 0.3957
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1240 - mean_absolute_error: 0.2260 - val_loss: 0.2719 - val_mean_absolute_error: 0.4061
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0928 - mean_absolute_error: 0.2190 - val_loss: 0.2656 - val_mean_absolute_error: 0.4065
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1094 - mean_absolute_error: 0.2484 - val_loss: 0.2565 - val_mean_absolute_error: 0.4055
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1140 - mean_absolute_error: 0.2508 - val_loss: 0.2614 - val_mean_absolute_error: 0.4068
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0922 - mean_absolute_error: 0.2164 - val_loss: 0.2821 - val_mean_absolute_error: 0.4113

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.1567 - mean_absolute_error: 0.2968 - val_loss: 0.0255 - val_mean_absolute_error: 0.1078
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1726 - mean_absolute_error: 0.2832 - val_loss: 0.0403 - val_mean_absolute_error: 0.1536
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1566 - mean_absolute_error: 0.2952 - val_loss: 0.0499 - val_mean_absolute_error: 0.1783
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1190 - mean_absolute_error: 0.2489 - val_loss: 0.0529 - val_mean_absolute_error: 0.1885
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1018 - mean_absolute_error: 0.2421 - val_loss: 0.0558 - val_mean_absolute_error: 0.1898
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1118 - mean_absolute_error: 0.2424 - val_loss: 0.0633 - val_mean_absolute_error: 0.1867
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1217 - mean_absolute_error: 0.2659 - val_loss: 0.0580 - val_mean_absolute_error: 0.1762
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0876 - mean_absolute_error: 0.2294 - val_loss: 0.0513 - val_mean_absolute_error: 0.1785
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1062 - mean_absolute_error: 0.2345 - val_loss: 0.0664 - val_mean_absolute_error: 0.1979
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1007 - mean_absolute_error: 0.2340 - val_loss: 0.0630 - val_mean_absolute_error: 0.1978
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0843 - mean_absolute_error: 0.2099 - val_loss: 0.0661 - val_mean_absolute_error: 0.1982
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1245 - mean_absolute_error: 0.2464 - val_loss: 0.0689 - val_mean_absolute_error: 0.1885
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1035 - mean_absolute_error: 0.2248 - val_loss: 0.0608 - val_mean_absolute_error: 0.1730
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0729 - mean_absolute_error: 0.1877 - val_loss: 0.0617 - val_mean_absolute_error: 0.1860
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1113 - mean_absolute_error: 0.2278 - val_loss: 0.0604 - val_mean_absolute_error: 0.1784
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0946 - mean_absolute_error: 0.2102 - val_loss: 0.0591 - val_mean_absolute_error: 0.1724
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0794 - mean_absolute_error: 0.1965 - val_loss: 0.0587 - val_mean_absolute_error: 0.1769
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0850 - mean_absolute_error: 0.2018 - val_loss: 0.0578 - val_mean_absolute_error: 0.1771
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0789 - mean_absolute_error: 0.1948 - val_loss: 0.0578 - val_mean_absolute_error: 0.1795
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0813 - mean_absolute_error: 0.2165 - val_loss: 0.0608 - val_mean_absolute_error: 0.1828
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0675 - mean_absolute_error: 0.1831 - val_loss: 0.0709 - val_mean_absolute_error: 0.1949
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0653 - mean_absolute_error: 0.1947 - val_loss: 0.0708 - val_mean_absolute_error: 0.1983
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0697 - mean_absolute_error: 0.1901 - val_loss: 0.0730 - val_mean_absolute_error: 0.2070
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1158 - mean_absolute_error: 0.2301 - val_loss: 0.0769 - val_mean_absolute_error: 0.2178
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0588 - mean_absolute_error: 0.1713 - val_loss: 0.0648 - val_mean_absolute_error: 0.1923
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0836 - mean_absolute_error: 0.1922 - val_loss: 0.0643 - val_mean_absolute_error: 0.1801
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0537 - mean_absolute_error: 0.1789 - val_loss: 0.0662 - val_mean_absolute_error: 0.1789
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0636 - mean_absolute_error: 0.1788 - val_loss: 0.0711 - val_mean_absolute_error: 0.1905
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0754 - mean_absolute_error: 0.1966 - val_loss: 0.0748 - val_mean_absolute_error: 0.1974
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0793 - mean_absolute_error: 0.2060 - val_loss: 0.0649 - val_mean_absolute_error: 0.1815
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0771 - mean_absolute_error: 0.1961 - val_loss: 0.0597 - val_mean_absolute_error: 0.1790
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0524 - mean_absolute_error: 0.1680 - val_loss: 0.0589 - val_mean_absolute_error: 0.1852
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0651 - mean_absolute_error: 0.1863 - val_loss: 0.0610 - val_mean_absolute_error: 0.1799
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0800 - mean_absolute_error: 0.1893 - val_loss: 0.0674 - val_mean_absolute_error: 0.1945
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0543 - mean_absolute_error: 0.1630 - val_loss: 0.0733 - val_mean_absolute_error: 0.2077
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0915 - mean_absolute_error: 0.1934 - val_loss: 0.0702 - val_mean_absolute_error: 0.2028
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0667 - mean_absolute_error: 0.1821 - val_loss: 0.0636 - val_mean_absolute_error: 0.1982
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0593 - mean_absolute_error: 0.1840 - val_loss: 0.0612 - val_mean_absolute_error: 0.1944
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0763 - mean_absolute_error: 0.1932 - val_loss: 0.0645 - val_mean_absolute_error: 0.1944
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0541 - mean_absolute_error: 0.1549 - val_loss: 0.0668 - val_mean_absolute_error: 0.1948
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0559 - mean_absolute_error: 0.1568 - val_loss: 0.0692 - val_mean_absolute_error: 0.1965
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0721 - mean_absolute_error: 0.1827 - val_loss: 0.0675 - val_mean_absolute_error: 0.1960
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0511 - mean_absolute_error: 0.1606 - val_loss: 0.0665 - val_mean_absolute_error: 0.1930
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0855 - mean_absolute_error: 0.1792 - val_loss: 0.0661 - val_mean_absolute_error: 0.1854
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0527 - mean_absolute_error: 0.1601 - val_loss: 0.0744 - val_mean_absolute_error: 0.1863
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0894 - mean_absolute_error: 0.2212 - val_loss: 0.0922 - val_mean_absolute_error: 0.2140
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0612 - mean_absolute_error: 0.1701 - val_loss: 0.1138 - val_mean_absolute_error: 0.2426
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0737 - mean_absolute_error: 0.1959 - val_loss: 0.1087 - val_mean_absolute_error: 0.2303
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0710 - mean_absolute_error: 0.1817 - val_loss: 0.0969 - val_mean_absolute_error: 0.2143
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0718 - mean_absolute_error: 0.1769 - val_loss: 0.0931 - val_mean_absolute_error: 0.2222

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.0969 - mean_absolute_error: 0.2229 - val_loss: 0.0222 - val_mean_absolute_error: 0.0960
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1051 - mean_absolute_error: 0.2402 - val_loss: 0.0171 - val_mean_absolute_error: 0.0814
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0829 - mean_absolute_error: 0.1995 - val_loss: 0.0265 - val_mean_absolute_error: 0.1150
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0823 - mean_absolute_error: 0.2171 - val_loss: 0.0345 - val_mean_absolute_error: 0.1368
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0701 - mean_absolute_error: 0.1857 - val_loss: 0.0303 - val_mean_absolute_error: 0.1235
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0559 - mean_absolute_error: 0.1654 - val_loss: 0.0262 - val_mean_absolute_error: 0.1149
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0665 - mean_absolute_error: 0.1784 - val_loss: 0.0298 - val_mean_absolute_error: 0.1164
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0828 - mean_absolute_error: 0.2126 - val_loss: 0.0250 - val_mean_absolute_error: 0.1071
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0736 - mean_absolute_error: 0.1893 - val_loss: 0.0216 - val_mean_absolute_error: 0.0968
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 58ms/step - loss: 0.0759 - mean_absolute_error: 0.1762 - val_loss: 0.0233 - val_mean_absolute_error: 0.1020
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0877 - mean_absolute_error: 0.2024 - val_loss: 0.0280 - val_mean_absolute_error: 0.1214
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0909 - mean_absolute_error: 0.2110 - val_loss: 0.0230 - val_mean_absolute_error: 0.1045
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0596 - mean_absolute_error: 0.1825 - val_loss: 0.0270 - val_mean_absolute_error: 0.1189
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1211 - mean_absolute_error: 0.2268 - val_loss: 0.0314 - val_mean_absolute_error: 0.1280
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0799 - mean_absolute_error: 0.2018 - val_loss: 0.0263 - val_mean_absolute_error: 0.1122
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0466 - mean_absolute_error: 0.1555 - val_loss: 0.0239 - val_mean_absolute_error: 0.1091
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0647 - mean_absolute_error: 0.1788 - val_loss: 0.0262 - val_mean_absolute_error: 0.1234
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0433 - mean_absolute_error: 0.1528 - val_loss: 0.0321 - val_mean_absolute_error: 0.1392
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0578 - mean_absolute_error: 0.1751 - val_loss: 0.0272 - val_mean_absolute_error: 0.1248
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0633 - mean_absolute_error: 0.1917 - val_loss: 0.0352 - val_mean_absolute_error: 0.1344
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0476 - mean_absolute_error: 0.1540 - val_loss: 0.0404 - val_mean_absolute_error: 0.1455
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0505 - mean_absolute_error: 0.1612 - val_loss: 0.0306 - val_mean_absolute_error: 0.1306
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0557 - mean_absolute_error: 0.1608 - val_loss: 0.0331 - val_mean_absolute_error: 0.1354
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0754 - mean_absolute_error: 0.1838 - val_loss: 0.0321 - val_mean_absolute_error: 0.1298
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0824 - mean_absolute_error: 0.2056 - val_loss: 0.0312 - val_mean_absolute_error: 0.1301
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 - mean_absolute_error: 0.1646 - val_loss: 0.0300 - val_mean_absolute_error: 0.1246
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0545 - mean_absolute_error: 0.1641 - val_loss: 0.0302 - val_mean_absolute_error: 0.1237
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0731 - mean_absolute_error: 0.1764 - val_loss: 0.0410 - val_mean_absolute_error: 0.1384
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0573 - mean_absolute_error: 0.1679 - val_loss: 0.0421 - val_mean_absolute_error: 0.1488
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0606 - mean_absolute_error: 0.1673 - val_loss: 0.0361 - val_mean_absolute_error: 0.1454
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0771 - mean_absolute_error: 0.1855 - val_loss: 0.0378 - val_mean_absolute_error: 0.1407
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0697 - mean_absolute_error: 0.2053 - val_loss: 0.0413 - val_mean_absolute_error: 0.1536
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0488 - mean_absolute_error: 0.1531 - val_loss: 0.0425 - val_mean_absolute_error: 0.1538
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0913 - mean_absolute_error: 0.2029 - val_loss: 0.0372 - val_mean_absolute_error: 0.1409
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0664 - mean_absolute_error: 0.1623 - val_loss: 0.0302 - val_mean_absolute_error: 0.1193
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0708 - mean_absolute_error: 0.1895 - val_loss: 0.0285 - val_mean_absolute_error: 0.1163
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1037 - mean_absolute_error: 0.2016 - val_loss: 0.0302 - val_mean_absolute_error: 0.1227
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0684 - mean_absolute_error: 0.1768 - val_loss: 0.0282 - val_mean_absolute_error: 0.1213
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0704 - mean_absolute_error: 0.1926 - val_loss: 0.0324 - val_mean_absolute_error: 0.1275
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0767 - mean_absolute_error: 0.1877 - val_loss: 0.0277 - val_mean_absolute_error: 0.1216
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0524 - mean_absolute_error: 0.1683 - val_loss: 0.0309 - val_mean_absolute_error: 0.1096
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0949 - mean_absolute_error: 0.2063 - val_loss: 0.0291 - val_mean_absolute_error: 0.1078
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0534 - mean_absolute_error: 0.1692 - val_loss: 0.0288 - val_mean_absolute_error: 0.1273
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0560 - mean_absolute_error: 0.1595 - val_loss: 0.0346 - val_mean_absolute_error: 0.1391
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0630 - mean_absolute_error: 0.1714 - val_loss: 0.0300 - val_mean_absolute_error: 0.1249
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0665 - mean_absolute_error: 0.1662 - val_loss: 0.0311 - val_mean_absolute_error: 0.1165
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0757 - mean_absolute_error: 0.1887 - val_loss: 0.0310 - val_mean_absolute_error: 0.1156
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0506 - mean_absolute_error: 0.1601 - val_loss: 0.0263 - val_mean_absolute_error: 0.1096
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0427 - mean_absolute_error: 0.1570 - val_loss: 0.0276 - val_mean_absolute_error: 0.1142
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0737 - mean_absolute_error: 0.1793 - val_loss: 0.0309 - val_mean_absolute_error: 0.1205

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0731 - mean_absolute_error: 0.1899 - val_loss: 0.0181 - val_mean_absolute_error: 0.0855
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0648 - mean_absolute_error: 0.1765 - val_loss: 0.0171 - val_mean_absolute_error: 0.0847
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0493 - mean_absolute_error: 0.1639 - val_loss: 0.0196 - val_mean_absolute_error: 0.0982
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0679 - mean_absolute_error: 0.1753 - val_loss: 0.0248 - val_mean_absolute_error: 0.1138
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0437 - mean_absolute_error: 0.1512 - val_loss: 0.0225 - val_mean_absolute_error: 0.1098
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0532 - mean_absolute_error: 0.1721 - val_loss: 0.0234 - val_mean_absolute_error: 0.1108
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0549 - mean_absolute_error: 0.1629 - val_loss: 0.0171 - val_mean_absolute_error: 0.0842
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0787 - mean_absolute_error: 0.1878 - val_loss: 0.0143 - val_mean_absolute_error: 0.0733
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0753 - mean_absolute_error: 0.1959 - val_loss: 0.0210 - val_mean_absolute_error: 0.0998
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0754 - mean_absolute_error: 0.1933 - val_loss: 0.0233 - val_mean_absolute_error: 0.1061
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0558 - mean_absolute_error: 0.1573 - val_loss: 0.0200 - val_mean_absolute_error: 0.1025
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0514 - mean_absolute_error: 0.1543 - val_loss: 0.0179 - val_mean_absolute_error: 0.0989
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0659 - mean_absolute_error: 0.1743 - val_loss: 0.0177 - val_mean_absolute_error: 0.0977
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0505 - mean_absolute_error: 0.1575 - val_loss: 0.0228 - val_mean_absolute_error: 0.1093
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0468 - mean_absolute_error: 0.1518 - val_loss: 0.0211 - val_mean_absolute_error: 0.1015
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0525 - mean_absolute_error: 0.1589 - val_loss: 0.0196 - val_mean_absolute_error: 0.0943
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0562 - mean_absolute_error: 0.1623 - val_loss: 0.0192 - val_mean_absolute_error: 0.0953
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0377 - mean_absolute_error: 0.1289 - val_loss: 0.0216 - val_mean_absolute_error: 0.1033
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0610 - mean_absolute_error: 0.1583 - val_loss: 0.0217 - val_mean_absolute_error: 0.1071
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0591 - mean_absolute_error: 0.1684 - val_loss: 0.0182 - val_mean_absolute_error: 0.0932
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0675 - mean_absolute_error: 0.1816 - val_loss: 0.0190 - val_mean_absolute_error: 0.0936
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0531 - mean_absolute_error: 0.1661 - val_loss: 0.0223 - val_mean_absolute_error: 0.1085
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0449 - mean_absolute_error: 0.1453 - val_loss: 0.0224 - val_mean_absolute_error: 0.1129
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0386 - mean_absolute_error: 0.1349 - val_loss: 0.0203 - val_mean_absolute_error: 0.1048
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0456 - mean_absolute_error: 0.1462 - val_loss: 0.0178 - val_mean_absolute_error: 0.0939
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0770 - mean_absolute_error: 0.1920 - val_loss: 0.0183 - val_mean_absolute_error: 0.0980
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0709 - mean_absolute_error: 0.1807 - val_loss: 0.0207 - val_mean_absolute_error: 0.1092
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0476 - mean_absolute_error: 0.1419 - val_loss: 0.0325 - val_mean_absolute_error: 0.1396
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0674 - mean_absolute_error: 0.1752 - val_loss: 0.0383 - val_mean_absolute_error: 0.1475
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0447 - mean_absolute_error: 0.1468 - val_loss: 0.0335 - val_mean_absolute_error: 0.1397
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0462 - mean_absolute_error: 0.1438 - val_loss: 0.0289 - val_mean_absolute_error: 0.1240
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0475 - mean_absolute_error: 0.1434 - val_loss: 0.0260 - val_mean_absolute_error: 0.1200
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0411 - mean_absolute_error: 0.1461 - val_loss: 0.0254 - val_mean_absolute_error: 0.1211
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0774 - mean_absolute_error: 0.1977 - val_loss: 0.0312 - val_mean_absolute_error: 0.1315
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0743 - mean_absolute_error: 0.1708 - val_loss: 0.0369 - val_mean_absolute_error: 0.1436
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0305 - mean_absolute_error: 0.1228 - val_loss: 0.0379 - val_mean_absolute_error: 0.1463
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0548 - mean_absolute_error: 0.1613 - val_loss: 0.0387 - val_mean_absolute_error: 0.1468
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0453 - mean_absolute_error: 0.1394 - val_loss: 0.0384 - val_mean_absolute_error: 0.1443
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0432 - mean_absolute_error: 0.1555 - val_loss: 0.0317 - val_mean_absolute_error: 0.1352
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0475 - mean_absolute_error: 0.1440 - val_loss: 0.0298 - val_mean_absolute_error: 0.1335
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0513 - mean_absolute_error: 0.1481 - val_loss: 0.0315 - val_mean_absolute_error: 0.1388
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.0343 - mean_absolute_error: 0.1267 - val_loss: 0.0402 - val_mean_absolute_error: 0.1498
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0537 - mean_absolute_error: 0.1808 - val_loss: 0.0427 - val_mean_absolute_error: 0.1531
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0666 - mean_absolute_error: 0.1797 - val_loss: 0.0332 - val_mean_absolute_error: 0.1394
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0508 - mean_absolute_error: 0.1408 - val_loss: 0.0341 - val_mean_absolute_error: 0.1432
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0535 - mean_absolute_error: 0.1632 - val_loss: 0.0263 - val_mean_absolute_error: 0.1181
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0391 - mean_absolute_error: 0.1362 - val_loss: 0.0212 - val_mean_absolute_error: 0.1009
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0410 - mean_absolute_error: 0.1470 - val_loss: 0.0258 - val_mean_absolute_error: 0.1102
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0437 - mean_absolute_error: 0.1433 - val_loss: 0.0304 - val_mean_absolute_error: 0.1234
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0546 - mean_absolute_error: 0.1685 - val_loss: 0.0317 - val_mean_absolute_error: 0.1317

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0443 - mean_absolute_error: 0.1546 - val_loss: 0.0329 - val_mean_absolute_error: 0.1281
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0385 - mean_absolute_error: 0.1351 - val_loss: 0.0334 - val_mean_absolute_error: 0.1227
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0333 - mean_absolute_error: 0.1260 - val_loss: 0.0342 - val_mean_absolute_error: 0.1246
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0377 - mean_absolute_error: 0.1365 - val_loss: 0.0369 - val_mean_absolute_error: 0.1421
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0571 - mean_absolute_error: 0.1532 - val_loss: 0.0410 - val_mean_absolute_error: 0.1562
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0333 - mean_absolute_error: 0.1252 - val_loss: 0.0390 - val_mean_absolute_error: 0.1504
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0464 - mean_absolute_error: 0.1428 - val_loss: 0.0323 - val_mean_absolute_error: 0.1280
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0364 - mean_absolute_error: 0.1365 - val_loss: 0.0287 - val_mean_absolute_error: 0.1190
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0404 - mean_absolute_error: 0.1440 - val_loss: 0.0301 - val_mean_absolute_error: 0.1294
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0570 - mean_absolute_error: 0.1617 - val_loss: 0.0379 - val_mean_absolute_error: 0.1487
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0583 - mean_absolute_error: 0.1441 - val_loss: 0.0392 - val_mean_absolute_error: 0.1506
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.0656 - mean_absolute_error: 0.1583 - val_loss: 0.0371 - val_mean_absolute_error: 0.1418
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0460 - mean_absolute_error: 0.1411 - val_loss: 0.0388 - val_mean_absolute_error: 0.1495
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0415 - mean_absolute_error: 0.1299 - val_loss: 0.0378 - val_mean_absolute_error: 0.1465
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0532 - mean_absolute_error: 0.1607 - val_loss: 0.0360 - val_mean_absolute_error: 0.1389
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0317 - mean_absolute_error: 0.1194 - val_loss: 0.0369 - val_mean_absolute_error: 0.1414
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0462 - mean_absolute_error: 0.1587 - val_loss: 0.0300 - val_mean_absolute_error: 0.1253
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0298 - mean_absolute_error: 0.1135 - val_loss: 0.0279 - val_mean_absolute_error: 0.1269
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0476 - mean_absolute_error: 0.1531 - val_loss: 0.0360 - val_mean_absolute_error: 0.1462
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0311 - mean_absolute_error: 0.1220 - val_loss: 0.0394 - val_mean_absolute_error: 0.1518
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0420 - mean_absolute_error: 0.1455 - val_loss: 0.0349 - val_mean_absolute_error: 0.1444
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0506 - mean_absolute_error: 0.1608 - val_loss: 0.0286 - val_mean_absolute_error: 0.1334
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0349 - mean_absolute_error: 0.1296 - val_loss: 0.0312 - val_mean_absolute_error: 0.1360
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0450 - mean_absolute_error: 0.1473 - val_loss: 0.0420 - val_mean_absolute_error: 0.1578
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0475 - mean_absolute_error: 0.1395 - val_loss: 0.0488 - val_mean_absolute_error: 0.1651
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0368 - mean_absolute_error: 0.1350 - val_loss: 0.0443 - val_mean_absolute_error: 0.1540
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0455 - mean_absolute_error: 0.1420 - val_loss: 0.0345 - val_mean_absolute_error: 0.1321
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0429 - mean_absolute_error: 0.1387 - val_loss: 0.0292 - val_mean_absolute_error: 0.1187
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0355 - mean_absolute_error: 0.1272 - val_loss: 0.0297 - val_mean_absolute_error: 0.1239
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0414 - mean_absolute_error: 0.1378 - val_loss: 0.0313 - val_mean_absolute_error: 0.1300
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0300 - mean_absolute_error: 0.1171 - val_loss: 0.0343 - val_mean_absolute_error: 0.1384
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0267 - mean_absolute_error: 0.1128 - val_loss: 0.0410 - val_mean_absolute_error: 0.1564
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0263 - mean_absolute_error: 0.1028 - val_loss: 0.0515 - val_mean_absolute_error: 0.1840
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0418 - mean_absolute_error: 0.1300 - val_loss: 0.0565 - val_mean_absolute_error: 0.1946
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0316 - mean_absolute_error: 0.1196 - val_loss: 0.0575 - val_mean_absolute_error: 0.1952
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0377 - mean_absolute_error: 0.1290 - val_loss: 0.0542 - val_mean_absolute_error: 0.1878
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0467 - mean_absolute_error: 0.1467 - val_loss: 0.0424 - val_mean_absolute_error: 0.1667
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0366 - mean_absolute_error: 0.1280 - val_loss: 0.0342 - val_mean_absolute_error: 0.1453
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0434 - mean_absolute_error: 0.1460 - val_loss: 0.0316 - val_mean_absolute_error: 0.1403
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0328 - mean_absolute_error: 0.1218 - val_loss: 0.0331 - val_mean_absolute_error: 0.1453
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0631 - mean_absolute_error: 0.1667 - val_loss: 0.0340 - val_mean_absolute_error: 0.1449
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0291 - mean_absolute_error: 0.1205 - val_loss: 0.0307 - val_mean_absolute_error: 0.1390
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0352 - mean_absolute_error: 0.1224 - val_loss: 0.0308 - val_mean_absolute_error: 0.1373
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0335 - mean_absolute_error: 0.1250 - val_loss: 0.0356 - val_mean_absolute_error: 0.1526
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0283 - mean_absolute_error: 0.1160 - val_loss: 0.0435 - val_mean_absolute_error: 0.1707
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0348 - mean_absolute_error: 0.1160 - val_loss: 0.0439 - val_mean_absolute_error: 0.1716
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0350 - mean_absolute_error: 0.1208 - val_loss: 0.0417 - val_mean_absolute_error: 0.1652
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0247 - mean_absolute_error: 0.1035 - val_loss: 0.0425 - val_mean_absolute_error: 0.1647
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0533 - mean_absolute_error: 0.1635 - val_loss: 0.0399 - val_mean_absolute_error: 0.1593
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0306 - mean_absolute_error: 0.1221 - val_loss: 0.0329 - val_mean_absolute_error: 0.1434
Validation losses: [0.28208214044570923, 0.09310527890920639, 0.030914682894945145, 0.03165797144174576, 0.03292366489768028]
HPS: {'player_emb_dim': 16, 'dense_units': 96, 'dense_units_2': 32, 'learning_rate': 0.01, 'dropout_rate': 0.2}. MSE during RandomSearch: 0.20530542731285095. Starting evaluation across all k folds...
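The per-fold validation losses printed above can be summarised into a single cross-validation score before comparing hyperparameter configurations. A minimal sketch, assuming the losses are collected into a list exactly as printed in the log above:

```python
import numpy as np

# Best validation losses (MSE) per fold, copied from the log output above
val_losses = [0.28208214044570923, 0.09310527890920639,
              0.030914682894945145, 0.03165797144174576,
              0.03292366489768028]

# Mean gives the headline cross-validation score; the standard deviation
# flags how much a single fold (here fold 1) dominates the estimate
cv_mean = float(np.mean(val_losses))
cv_std = float(np.std(val_losses))
print(f"CV MSE: {cv_mean:.4f} +/- {cv_std:.4f}")
```

With only ~20 recorded games per player, the spread across folds matters as much as the mean: fold 1's loss is roughly an order of magnitude above the others, so a configuration should be preferred only if it improves the mean without inflating the variance.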

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 91ms/step - loss: 1.0530 - mean_absolute_error: 0.8047 - val_loss: 1.3603 - val_mean_absolute_error: 0.9594
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 1.0936 - mean_absolute_error: 0.8291 - val_loss: 0.8600 - val_mean_absolute_error: 0.7386
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.5897 - mean_absolute_error: 0.5859 - val_loss: 0.7540 - val_mean_absolute_error: 0.6783
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.4791 - mean_absolute_error: 0.5365 - val_loss: 0.6341 - val_mean_absolute_error: 0.5913
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3476 - mean_absolute_error: 0.4421 - val_loss: 0.6431 - val_mean_absolute_error: 0.6426
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3946 - mean_absolute_error: 0.4574 - val_loss: 0.4583 - val_mean_absolute_error: 0.5057
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2859 - mean_absolute_error: 0.4027 - val_loss: 0.6036 - val_mean_absolute_error: 0.6225
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2343 - mean_absolute_error: 0.3703 - val_loss: 0.4796 - val_mean_absolute_error: 0.5520
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1862 - mean_absolute_error: 0.3411 - val_loss: 0.2954 - val_mean_absolute_error: 0.4324
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2077 - mean_absolute_error: 0.3473 - val_loss: 0.3766 - val_mean_absolute_error: 0.4905
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2338 - mean_absolute_error: 0.3428 - val_loss: 0.3986 - val_mean_absolute_error: 0.5005
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1796 - mean_absolute_error: 0.3219 - val_loss: 0.3177 - val_mean_absolute_error: 0.4255
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1617 - mean_absolute_error: 0.2937 - val_loss: 0.2558 - val_mean_absolute_error: 0.3849
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1359 - mean_absolute_error: 0.2631 - val_loss: 0.3250 - val_mean_absolute_error: 0.4540
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1357 - mean_absolute_error: 0.2701 - val_loss: 0.3121 - val_mean_absolute_error: 0.4394
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1614 - mean_absolute_error: 0.2943 - val_loss: 0.2737 - val_mean_absolute_error: 0.4179
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1356 - mean_absolute_error: 0.2738 - val_loss: 0.3202 - val_mean_absolute_error: 0.4574
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1223 - mean_absolute_error: 0.2582 - val_loss: 0.2702 - val_mean_absolute_error: 0.3997
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1205 - mean_absolute_error: 0.2607 - val_loss: 0.2452 - val_mean_absolute_error: 0.3756
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1062 - mean_absolute_error: 0.2359 - val_loss: 0.2604 - val_mean_absolute_error: 0.3863
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1259 - mean_absolute_error: 0.2462 - val_loss: 0.2564 - val_mean_absolute_error: 0.3793
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0891 - mean_absolute_error: 0.2082 - val_loss: 0.2333 - val_mean_absolute_error: 0.3634
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1276 - mean_absolute_error: 0.2408 - val_loss: 0.2345 - val_mean_absolute_error: 0.3688
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1199 - mean_absolute_error: 0.2383 - val_loss: 0.3527 - val_mean_absolute_error: 0.4901
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1541 - mean_absolute_error: 0.2900 - val_loss: 0.3213 - val_mean_absolute_error: 0.4381
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1528 - mean_absolute_error: 0.2907 - val_loss: 0.2514 - val_mean_absolute_error: 0.4153
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1122 - mean_absolute_error: 0.2532 - val_loss: 0.2652 - val_mean_absolute_error: 0.3996
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1173 - mean_absolute_error: 0.2472 - val_loss: 0.2626 - val_mean_absolute_error: 0.3858
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1089 - mean_absolute_error: 0.2305 - val_loss: 0.2215 - val_mean_absolute_error: 0.3638
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0915 - mean_absolute_error: 0.2185 - val_loss: 0.1970 - val_mean_absolute_error: 0.3553
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1135 - mean_absolute_error: 0.2550 - val_loss: 0.1624 - val_mean_absolute_error: 0.2959
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0995 - mean_absolute_error: 0.2166 - val_loss: 0.1586 - val_mean_absolute_error: 0.2739
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0662 - mean_absolute_error: 0.1789 - val_loss: 0.1690 - val_mean_absolute_error: 0.2881
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1068 - mean_absolute_error: 0.2184 - val_loss: 0.1833 - val_mean_absolute_error: 0.3241
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1136 - mean_absolute_error: 0.2221 - val_loss: 0.2231 - val_mean_absolute_error: 0.3673
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0922 - mean_absolute_error: 0.2157 - val_loss: 0.1988 - val_mean_absolute_error: 0.3549
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1298 - mean_absolute_error: 0.2662 - val_loss: 0.1963 - val_mean_absolute_error: 0.3371
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1269 - mean_absolute_error: 0.2511 - val_loss: 0.1869 - val_mean_absolute_error: 0.3523
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0743 - mean_absolute_error: 0.2010 - val_loss: 0.1815 - val_mean_absolute_error: 0.3523
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0817 - mean_absolute_error: 0.2062 - val_loss: 0.1757 - val_mean_absolute_error: 0.3403
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0962 - mean_absolute_error: 0.2241 - val_loss: 0.2096 - val_mean_absolute_error: 0.3795
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0812 - mean_absolute_error: 0.1985 - val_loss: 0.2046 - val_mean_absolute_error: 0.3676
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0681 - mean_absolute_error: 0.1851 - val_loss: 0.1583 - val_mean_absolute_error: 0.3069
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1086 - mean_absolute_error: 0.2344 - val_loss: 0.1519 - val_mean_absolute_error: 0.2984
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0807 - mean_absolute_error: 0.1951 - val_loss: 0.1556 - val_mean_absolute_error: 0.3219
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0664 - mean_absolute_error: 0.1681 - val_loss: 0.1757 - val_mean_absolute_error: 0.3475
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0764 - mean_absolute_error: 0.1864 - val_loss: 0.1718 - val_mean_absolute_error: 0.3400
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0677 - mean_absolute_error: 0.1821 - val_loss: 0.1592 - val_mean_absolute_error: 0.3036
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1035 - mean_absolute_error: 0.2355 - val_loss: 0.1474 - val_mean_absolute_error: 0.2855
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0753 - mean_absolute_error: 0.1954 - val_loss: 0.1438 - val_mean_absolute_error: 0.2723

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0998 - mean_absolute_error: 0.2193 - val_loss: 0.0677 - val_mean_absolute_error: 0.1863
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1129 - mean_absolute_error: 0.2324 - val_loss: 0.0661 - val_mean_absolute_error: 0.1796
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0910 - mean_absolute_error: 0.2219 - val_loss: 0.0841 - val_mean_absolute_error: 0.2044
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0878 - mean_absolute_error: 0.2186 - val_loss: 0.0765 - val_mean_absolute_error: 0.1859
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1083 - mean_absolute_error: 0.2163 - val_loss: 0.0691 - val_mean_absolute_error: 0.1680
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1151 - mean_absolute_error: 0.2234 - val_loss: 0.0492 - val_mean_absolute_error: 0.1601
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1020 - mean_absolute_error: 0.2089 - val_loss: 0.0561 - val_mean_absolute_error: 0.1758
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0899 - mean_absolute_error: 0.2092 - val_loss: 0.0714 - val_mean_absolute_error: 0.1822
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0996 - mean_absolute_error: 0.2077 - val_loss: 0.0902 - val_mean_absolute_error: 0.2229
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1167 - mean_absolute_error: 0.2261 - val_loss: 0.0649 - val_mean_absolute_error: 0.1852
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0825 - mean_absolute_error: 0.2145 - val_loss: 0.0718 - val_mean_absolute_error: 0.2036
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0975 - mean_absolute_error: 0.2086 - val_loss: 0.0620 - val_mean_absolute_error: 0.1938
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1020 - mean_absolute_error: 0.2279 - val_loss: 0.0608 - val_mean_absolute_error: 0.1692
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1211 - mean_absolute_error: 0.2468 - val_loss: 0.0634 - val_mean_absolute_error: 0.1959
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0929 - mean_absolute_error: 0.2065 - val_loss: 0.0769 - val_mean_absolute_error: 0.2071
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0794 - mean_absolute_error: 0.1838 - val_loss: 0.0370 - val_mean_absolute_error: 0.1262
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0676 - mean_absolute_error: 0.1628 - val_loss: 0.0514 - val_mean_absolute_error: 0.1537
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0867 - mean_absolute_error: 0.1980 - val_loss: 0.0679 - val_mean_absolute_error: 0.1738
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0598 - mean_absolute_error: 0.1721 - val_loss: 0.1129 - val_mean_absolute_error: 0.2393
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1066 - mean_absolute_error: 0.2303 - val_loss: 0.0585 - val_mean_absolute_error: 0.1799
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0908 - mean_absolute_error: 0.2045 - val_loss: 0.0556 - val_mean_absolute_error: 0.1681
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0579 - mean_absolute_error: 0.1655 - val_loss: 0.0974 - val_mean_absolute_error: 0.2257
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0718 - mean_absolute_error: 0.1855 - val_loss: 0.1020 - val_mean_absolute_error: 0.2234
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1166 - mean_absolute_error: 0.2427 - val_loss: 0.0476 - val_mean_absolute_error: 0.1543
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0537 - mean_absolute_error: 0.1528 - val_loss: 0.0384 - val_mean_absolute_error: 0.1301
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0601 - mean_absolute_error: 0.1640 - val_loss: 0.0584 - val_mean_absolute_error: 0.1811
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0814 - mean_absolute_error: 0.1767 - val_loss: 0.0836 - val_mean_absolute_error: 0.1957
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0700 - mean_absolute_error: 0.1639 - val_loss: 0.0983 - val_mean_absolute_error: 0.2435
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1060 - mean_absolute_error: 0.2126 - val_loss: 0.0646 - val_mean_absolute_error: 0.1903
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0921 - mean_absolute_error: 0.2045 - val_loss: 0.0801 - val_mean_absolute_error: 0.2095
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0977 - mean_absolute_error: 0.1875 - val_loss: 0.1104 - val_mean_absolute_error: 0.2605
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0931 - mean_absolute_error: 0.2005 - val_loss: 0.0575 - val_mean_absolute_error: 0.1753
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0596 - mean_absolute_error: 0.1614 - val_loss: 0.0575 - val_mean_absolute_error: 0.1681
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0837 - mean_absolute_error: 0.1940 - val_loss: 0.0869 - val_mean_absolute_error: 0.2152
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0550 - mean_absolute_error: 0.1496 - val_loss: 0.1308 - val_mean_absolute_error: 0.2697
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0773 - mean_absolute_error: 0.1824 - val_loss: 0.0997 - val_mean_absolute_error: 0.2405
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0604 - mean_absolute_error: 0.1482 - val_loss: 0.0817 - val_mean_absolute_error: 0.2065
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0719 - mean_absolute_error: 0.1876 - val_loss: 0.0869 - val_mean_absolute_error: 0.2222
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0764 - mean_absolute_error: 0.1750 - val_loss: 0.0642 - val_mean_absolute_error: 0.2001
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0608 - mean_absolute_error: 0.1685 - val_loss: 0.0496 - val_mean_absolute_error: 0.1640
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0617 - mean_absolute_error: 0.1690 - val_loss: 0.0722 - val_mean_absolute_error: 0.2066
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0687 - mean_absolute_error: 0.1708 - val_loss: 0.0828 - val_mean_absolute_error: 0.2245
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0528 - mean_absolute_error: 0.1469 - val_loss: 0.0682 - val_mean_absolute_error: 0.1965
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0492 - mean_absolute_error: 0.1598 - val_loss: 0.0746 - val_mean_absolute_error: 0.2068
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0594 - mean_absolute_error: 0.1462 - val_loss: 0.0990 - val_mean_absolute_error: 0.2534
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0766 - mean_absolute_error: 0.1906 - val_loss: 0.1017 - val_mean_absolute_error: 0.2306
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0665 - mean_absolute_error: 0.1735 - val_loss: 0.0925 - val_mean_absolute_error: 0.2186
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0761 - mean_absolute_error: 0.1912 - val_loss: 0.0961 - val_mean_absolute_error: 0.2357
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0684 - mean_absolute_error: 0.1864 - val_loss: 0.1153 - val_mean_absolute_error: 0.2481
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0552 - mean_absolute_error: 0.1584 - val_loss: 0.1259 - val_mean_absolute_error: 0.2636

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0884 - mean_absolute_error: 0.2169 - val_loss: 0.0210 - val_mean_absolute_error: 0.0832
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0858 - mean_absolute_error: 0.2010 - val_loss: 0.0229 - val_mean_absolute_error: 0.0823
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0862 - mean_absolute_error: 0.1947 - val_loss: 0.0659 - val_mean_absolute_error: 0.1825
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1354 - mean_absolute_error: 0.2534 - val_loss: 0.0586 - val_mean_absolute_error: 0.1713
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1006 - mean_absolute_error: 0.2283 - val_loss: 0.0540 - val_mean_absolute_error: 0.1453
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1324 - mean_absolute_error: 0.2584 - val_loss: 0.0549 - val_mean_absolute_error: 0.1589
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1259 - mean_absolute_error: 0.2315 - val_loss: 0.0313 - val_mean_absolute_error: 0.1163
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0719 - mean_absolute_error: 0.1886 - val_loss: 0.0436 - val_mean_absolute_error: 0.1477
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0784 - mean_absolute_error: 0.1959 - val_loss: 0.0382 - val_mean_absolute_error: 0.1268
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0780 - mean_absolute_error: 0.1857 - val_loss: 0.0368 - val_mean_absolute_error: 0.1315
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0742 - mean_absolute_error: 0.1803 - val_loss: 0.0538 - val_mean_absolute_error: 0.1754
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0949 - mean_absolute_error: 0.2116 - val_loss: 0.0338 - val_mean_absolute_error: 0.1218
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0774 - mean_absolute_error: 0.1868 - val_loss: 0.0514 - val_mean_absolute_error: 0.1580
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0732 - mean_absolute_error: 0.1831 - val_loss: 0.0361 - val_mean_absolute_error: 0.1331
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0681 - mean_absolute_error: 0.1879 - val_loss: 0.0241 - val_mean_absolute_error: 0.0883
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0573 - mean_absolute_error: 0.1454 - val_loss: 0.0249 - val_mean_absolute_error: 0.1022
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0744 - mean_absolute_error: 0.1970 - val_loss: 0.0234 - val_mean_absolute_error: 0.0934
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0702 - mean_absolute_error: 0.1667 - val_loss: 0.0405 - val_mean_absolute_error: 0.1372
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0824 - mean_absolute_error: 0.1855 - val_loss: 0.0473 - val_mean_absolute_error: 0.1506
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0587 - mean_absolute_error: 0.1543 - val_loss: 0.0432 - val_mean_absolute_error: 0.1460
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0829 - mean_absolute_error: 0.2055 - val_loss: 0.0284 - val_mean_absolute_error: 0.1101
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0741 - mean_absolute_error: 0.1953 - val_loss: 0.0917 - val_mean_absolute_error: 0.2145
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1252 - mean_absolute_error: 0.2397 - val_loss: 0.0496 - val_mean_absolute_error: 0.1639
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0924 - mean_absolute_error: 0.2088 - val_loss: 0.0503 - val_mean_absolute_error: 0.1594
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0917 - mean_absolute_error: 0.2077 - val_loss: 0.0633 - val_mean_absolute_error: 0.1782
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0809 - mean_absolute_error: 0.1888 - val_loss: 0.0827 - val_mean_absolute_error: 0.2194
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1174 - mean_absolute_error: 0.2433 - val_loss: 0.0628 - val_mean_absolute_error: 0.1859
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0884 - mean_absolute_error: 0.1793 - val_loss: 0.0437 - val_mean_absolute_error: 0.1573
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0720 - mean_absolute_error: 0.1704 - val_loss: 0.0521 - val_mean_absolute_error: 0.1510
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0511 - mean_absolute_error: 0.1514 - val_loss: 0.0493 - val_mean_absolute_error: 0.1391
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0752 - mean_absolute_error: 0.1647 - val_loss: 0.0405 - val_mean_absolute_error: 0.1343
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0629 - mean_absolute_error: 0.1669 - val_loss: 0.0401 - val_mean_absolute_error: 0.1277
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0738 - mean_absolute_error: 0.1790 - val_loss: 0.0656 - val_mean_absolute_error: 0.1739
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0727 - mean_absolute_error: 0.1779 - val_loss: 0.0667 - val_mean_absolute_error: 0.1779
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0542 - mean_absolute_error: 0.1598 - val_loss: 0.0372 - val_mean_absolute_error: 0.1229
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0995 - mean_absolute_error: 0.1936 - val_loss: 0.0503 - val_mean_absolute_error: 0.1455
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0668 - mean_absolute_error: 0.1594 - val_loss: 0.0545 - val_mean_absolute_error: 0.1630
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0892 - mean_absolute_error: 0.1847 - val_loss: 0.0562 - val_mean_absolute_error: 0.1771
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0788 - mean_absolute_error: 0.1893 - val_loss: 0.0748 - val_mean_absolute_error: 0.1936
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0829 - mean_absolute_error: 0.1857 - val_loss: 0.1276 - val_mean_absolute_error: 0.2451
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1003 - mean_absolute_error: 0.2355 - val_loss: 0.0965 - val_mean_absolute_error: 0.2328
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1024 - mean_absolute_error: 0.2007 - val_loss: 0.0777 - val_mean_absolute_error: 0.2170
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0684 - mean_absolute_error: 0.1815 - val_loss: 0.0775 - val_mean_absolute_error: 0.1877
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0756 - mean_absolute_error: 0.1813 - val_loss: 0.0968 - val_mean_absolute_error: 0.2269
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0983 - mean_absolute_error: 0.2275 - val_loss: 0.0545 - val_mean_absolute_error: 0.1631
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0647 - mean_absolute_error: 0.1821 - val_loss: 0.0372 - val_mean_absolute_error: 0.1219
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0513 - mean_absolute_error: 0.1604 - val_loss: 0.0345 - val_mean_absolute_error: 0.1286
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0867 - mean_absolute_error: 0.2000 - val_loss: 0.0582 - val_mean_absolute_error: 0.1665
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0700 - mean_absolute_error: 0.1690 - val_loss: 0.0527 - val_mean_absolute_error: 0.1481
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0512 - mean_absolute_error: 0.1654 - val_loss: 0.0510 - val_mean_absolute_error: 0.1372

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0974 - mean_absolute_error: 0.2024 - val_loss: 0.0182 - val_mean_absolute_error: 0.0714
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0697 - mean_absolute_error: 0.1892 - val_loss: 0.0335 - val_mean_absolute_error: 0.1309
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0726 - mean_absolute_error: 0.1809 - val_loss: 0.0258 - val_mean_absolute_error: 0.1036
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0979 - mean_absolute_error: 0.1999 - val_loss: 0.0229 - val_mean_absolute_error: 0.0886
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0752 - mean_absolute_error: 0.1911 - val_loss: 0.0280 - val_mean_absolute_error: 0.1045
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1002 - mean_absolute_error: 0.1996 - val_loss: 0.0218 - val_mean_absolute_error: 0.0867
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0620 - mean_absolute_error: 0.1614 - val_loss: 0.0551 - val_mean_absolute_error: 0.1721
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0956 - mean_absolute_error: 0.2337 - val_loss: 0.0308 - val_mean_absolute_error: 0.1112
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0814 - mean_absolute_error: 0.1926 - val_loss: 0.0289 - val_mean_absolute_error: 0.1078
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0725 - mean_absolute_error: 0.1978 - val_loss: 0.0262 - val_mean_absolute_error: 0.1084
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0645 - mean_absolute_error: 0.1801 - val_loss: 0.0241 - val_mean_absolute_error: 0.1032
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0655 - mean_absolute_error: 0.1719 - val_loss: 0.0271 - val_mean_absolute_error: 0.1080
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0761 - mean_absolute_error: 0.1783 - val_loss: 0.0347 - val_mean_absolute_error: 0.1270
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0978 - mean_absolute_error: 0.2094 - val_loss: 0.0414 - val_mean_absolute_error: 0.1501
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1135 - mean_absolute_error: 0.2353 - val_loss: 0.0350 - val_mean_absolute_error: 0.1224
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1053 - mean_absolute_error: 0.2375 - val_loss: 0.0304 - val_mean_absolute_error: 0.1019
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0795 - mean_absolute_error: 0.1782 - val_loss: 0.0263 - val_mean_absolute_error: 0.0990
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0792 - mean_absolute_error: 0.1978 - val_loss: 0.0195 - val_mean_absolute_error: 0.0711
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0763 - mean_absolute_error: 0.1873 - val_loss: 0.0361 - val_mean_absolute_error: 0.1214
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1013 - mean_absolute_error: 0.2255 - val_loss: 0.0381 - val_mean_absolute_error: 0.1329
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1074 - mean_absolute_error: 0.2048 - val_loss: 0.0499 - val_mean_absolute_error: 0.1773
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0829 - mean_absolute_error: 0.2153 - val_loss: 0.0326 - val_mean_absolute_error: 0.1228
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1099 - mean_absolute_error: 0.2193 - val_loss: 0.0551 - val_mean_absolute_error: 0.1710
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0974 - mean_absolute_error: 0.2213 - val_loss: 0.0388 - val_mean_absolute_error: 0.1500
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0565 - mean_absolute_error: 0.1559 - val_loss: 0.0336 - val_mean_absolute_error: 0.1220
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0657 - mean_absolute_error: 0.1808 - val_loss: 0.0289 - val_mean_absolute_error: 0.1140
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0582 - mean_absolute_error: 0.1580 - val_loss: 0.0506 - val_mean_absolute_error: 0.1665
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0813 - mean_absolute_error: 0.1886 - val_loss: 0.0299 - val_mean_absolute_error: 0.1217
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0567 - mean_absolute_error: 0.1604 - val_loss: 0.0246 - val_mean_absolute_error: 0.1023
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0558 - mean_absolute_error: 0.1575 - val_loss: 0.0286 - val_mean_absolute_error: 0.1175
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0700 - mean_absolute_error: 0.1871 - val_loss: 0.0446 - val_mean_absolute_error: 0.1550
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0553 - mean_absolute_error: 0.1551 - val_loss: 0.0283 - val_mean_absolute_error: 0.1123
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0511 - mean_absolute_error: 0.1461 - val_loss: 0.0413 - val_mean_absolute_error: 0.1333
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0759 - mean_absolute_error: 0.1874 - val_loss: 0.0449 - val_mean_absolute_error: 0.1433
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0467 - mean_absolute_error: 0.1471 - val_loss: 0.0474 - val_mean_absolute_error: 0.1553
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0749 - mean_absolute_error: 0.1836 - val_loss: 0.0346 - val_mean_absolute_error: 0.1259
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0481 - mean_absolute_error: 0.1468 - val_loss: 0.0375 - val_mean_absolute_error: 0.1307
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0658 - mean_absolute_error: 0.1633 - val_loss: 0.0324 - val_mean_absolute_error: 0.1137
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0701 - mean_absolute_error: 0.1876 - val_loss: 0.0364 - val_mean_absolute_error: 0.1271
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0566 - mean_absolute_error: 0.1667 - val_loss: 0.0408 - val_mean_absolute_error: 0.1462
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0526 - mean_absolute_error: 0.1536 - val_loss: 0.0281 - val_mean_absolute_error: 0.1097
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0545 - mean_absolute_error: 0.1513 - val_loss: 0.0368 - val_mean_absolute_error: 0.1272
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0871 - mean_absolute_error: 0.1962 - val_loss: 0.0688 - val_mean_absolute_error: 0.2069
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0865 - mean_absolute_error: 0.2010 - val_loss: 0.0469 - val_mean_absolute_error: 0.1525
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0918 - mean_absolute_error: 0.2192 - val_loss: 0.0493 - val_mean_absolute_error: 0.1705
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0594 - mean_absolute_error: 0.1674 - val_loss: 0.0547 - val_mean_absolute_error: 0.1634
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0860 - mean_absolute_error: 0.1925 - val_loss: 0.0401 - val_mean_absolute_error: 0.1290
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0796 - mean_absolute_error: 0.1903 - val_loss: 0.0345 - val_mean_absolute_error: 0.1278
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0863 - mean_absolute_error: 0.1947 - val_loss: 0.0356 - val_mean_absolute_error: 0.1345
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0715 - mean_absolute_error: 0.1668 - val_loss: 0.0327 - val_mean_absolute_error: 0.1165

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0573 - mean_absolute_error: 0.1455 - val_loss: 0.0356 - val_mean_absolute_error: 0.1369
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0580 - mean_absolute_error: 0.1646 - val_loss: 0.0224 - val_mean_absolute_error: 0.0926
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0457 - mean_absolute_error: 0.1487 - val_loss: 0.0239 - val_mean_absolute_error: 0.0824
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0709 - mean_absolute_error: 0.1738 - val_loss: 0.0319 - val_mean_absolute_error: 0.1053
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0652 - mean_absolute_error: 0.1711 - val_loss: 0.0356 - val_mean_absolute_error: 0.1221
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0570 - mean_absolute_error: 0.1613 - val_loss: 0.0631 - val_mean_absolute_error: 0.1709
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0674 - mean_absolute_error: 0.1819 - val_loss: 0.0193 - val_mean_absolute_error: 0.0702
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0554 - mean_absolute_error: 0.1515 - val_loss: 0.0168 - val_mean_absolute_error: 0.0668
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0497 - mean_absolute_error: 0.1319 - val_loss: 0.0203 - val_mean_absolute_error: 0.0825
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0395 - mean_absolute_error: 0.1345 - val_loss: 0.0269 - val_mean_absolute_error: 0.1097
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0679 - mean_absolute_error: 0.1572 - val_loss: 0.0476 - val_mean_absolute_error: 0.1664
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0570 - mean_absolute_error: 0.1482 - val_loss: 0.0375 - val_mean_absolute_error: 0.1392
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0691 - mean_absolute_error: 0.1639 - val_loss: 0.0283 - val_mean_absolute_error: 0.1221
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0659 - mean_absolute_error: 0.1687 - val_loss: 0.0243 - val_mean_absolute_error: 0.0891
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0460 - mean_absolute_error: 0.1486 - val_loss: 0.0148 - val_mean_absolute_error: 0.0558
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0522 - mean_absolute_error: 0.1396 - val_loss: 0.0477 - val_mean_absolute_error: 0.1703
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0593 - mean_absolute_error: 0.1646 - val_loss: 0.0360 - val_mean_absolute_error: 0.1395
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0598 - mean_absolute_error: 0.1507 - val_loss: 0.0313 - val_mean_absolute_error: 0.1220
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0590 - mean_absolute_error: 0.1667 - val_loss: 0.0240 - val_mean_absolute_error: 0.1022
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0534 - mean_absolute_error: 0.1525 - val_loss: 0.0433 - val_mean_absolute_error: 0.1446
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0959 - mean_absolute_error: 0.1793 - val_loss: 0.0330 - val_mean_absolute_error: 0.1246
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0528 - mean_absolute_error: 0.1410 - val_loss: 0.0277 - val_mean_absolute_error: 0.1125
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0613 - mean_absolute_error: 0.1583 - val_loss: 0.0383 - val_mean_absolute_error: 0.1427
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0547 - mean_absolute_error: 0.1543 - val_loss: 0.0434 - val_mean_absolute_error: 0.1368
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0397 - mean_absolute_error: 0.1291 - val_loss: 0.0534 - val_mean_absolute_error: 0.1802
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0525 - mean_absolute_error: 0.1593 - val_loss: 0.0403 - val_mean_absolute_error: 0.1484
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0391 - mean_absolute_error: 0.1326 - val_loss: 0.0237 - val_mean_absolute_error: 0.0916
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0680 - mean_absolute_error: 0.1586 - val_loss: 0.0213 - val_mean_absolute_error: 0.0862
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0506 - mean_absolute_error: 0.1454 - val_loss: 0.0425 - val_mean_absolute_error: 0.1413
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1087 - mean_absolute_error: 0.2019 - val_loss: 0.0270 - val_mean_absolute_error: 0.1046
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0424 - mean_absolute_error: 0.1425 - val_loss: 0.0260 - val_mean_absolute_error: 0.0986
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0475 - mean_absolute_error: 0.1402 - val_loss: 0.0309 - val_mean_absolute_error: 0.1151
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0385 - mean_absolute_error: 0.1301 - val_loss: 0.0388 - val_mean_absolute_error: 0.1376
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0508 - mean_absolute_error: 0.1376 - val_loss: 0.0227 - val_mean_absolute_error: 0.0937
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0596 - mean_absolute_error: 0.1481 - val_loss: 0.0352 - val_mean_absolute_error: 0.1268
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0569 - mean_absolute_error: 0.1610 - val_loss: 0.0375 - val_mean_absolute_error: 0.1413
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0482 - mean_absolute_error: 0.1549 - val_loss: 0.0298 - val_mean_absolute_error: 0.1167
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0329 - mean_absolute_error: 0.1273 - val_loss: 0.0287 - val_mean_absolute_error: 0.1202
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0517 - mean_absolute_error: 0.1550 - val_loss: 0.0252 - val_mean_absolute_error: 0.1076
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0332 - mean_absolute_error: 0.1232 - val_loss: 0.0321 - val_mean_absolute_error: 0.1223
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0386 - mean_absolute_error: 0.1322 - val_loss: 0.0280 - val_mean_absolute_error: 0.1001
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0397 - mean_absolute_error: 0.1296 - val_loss: 0.0259 - val_mean_absolute_error: 0.1044
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0544 - mean_absolute_error: 0.1336 - val_loss: 0.0281 - val_mean_absolute_error: 0.1116
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0413 - mean_absolute_error: 0.1292 - val_loss: 0.0272 - val_mean_absolute_error: 0.1061
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0486 - mean_absolute_error: 0.1470 - val_loss: 0.0259 - val_mean_absolute_error: 0.0964
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0721 - mean_absolute_error: 0.1685 - val_loss: 0.0614 - val_mean_absolute_error: 0.1879
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0861 - mean_absolute_error: 0.1847 - val_loss: 0.0470 - val_mean_absolute_error: 0.1466
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0398 - mean_absolute_error: 0.1396 - val_loss: 0.0339 - val_mean_absolute_error: 0.1243
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0476 - mean_absolute_error: 0.1497 - val_loss: 0.0290 - val_mean_absolute_error: 0.1171
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0577 - mean_absolute_error: 0.1558 - val_loss: 0.0300 - val_mean_absolute_error: 0.1161
Validation losses: [0.14380395412445068, 0.1258540004491806, 0.05096049979329109, 0.032678429037332535, 0.02999122440814972]
HPS: {'player_emb_dim': 16, 'dense_units': 96, 'dense_units_2': 80, 'learning_rate': 0.01, 'dropout_rate': 0.1}. MSE during RandomSearch: 0.261588990688324. Starting evaluation across all k folds...
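The per-fold validation losses printed above are easier to compare as a single aggregate. A minimal sketch of that aggregation follows; the list is copied verbatim from the output above, while the mean/std summary itself is an illustration and not part of the notebook's own code:

```python
import numpy as np

# Per-fold validation losses, copied from the cross-validation output above.
val_losses = [
    0.14380395412445068,
    0.1258540004491806,
    0.05096049979329109,
    0.032678429037332535,
    0.02999122440814972,
]

# Summarise the k-fold run as mean +/- population standard deviation.
mean_loss = float(np.mean(val_losses))
std_loss = float(np.std(val_losses))
print(f"CV loss: {mean_loss:.4f} +/- {std_loss:.4f}")
```

The large spread between folds (the first two folds are roughly four times worse than the last three) is itself informative: with only ~20 recorded games, which fold a player's appearances land in matters a lot, so the mean alone understates the uncertainty of the estimate.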

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 91ms/step - loss: 1.2980 - mean_absolute_error: 0.9228 - val_loss: 0.8279 - val_mean_absolute_error: 0.7165
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.8476 - mean_absolute_error: 0.7463 - val_loss: 0.6156 - val_mean_absolute_error: 0.6273
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.6604 - mean_absolute_error: 0.6097 - val_loss: 0.9970 - val_mean_absolute_error: 0.8399
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.5067 - mean_absolute_error: 0.5542 - val_loss: 0.7279 - val_mean_absolute_error: 0.7339
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3390 - mean_absolute_error: 0.4462 - val_loss: 0.5996 - val_mean_absolute_error: 0.6323
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2504 - mean_absolute_error: 0.3935 - val_loss: 0.4932 - val_mean_absolute_error: 0.5594
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1798 - mean_absolute_error: 0.3309 - val_loss: 0.3481 - val_mean_absolute_error: 0.4477
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1442 - mean_absolute_error: 0.2728 - val_loss: 0.2417 - val_mean_absolute_error: 0.3937
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1400 - mean_absolute_error: 0.2630 - val_loss: 0.2522 - val_mean_absolute_error: 0.3739
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1000 - mean_absolute_error: 0.2278 - val_loss: 0.2664 - val_mean_absolute_error: 0.3905
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 143ms/step - loss: 0.1162 - mean_absolute_error: 0.2450 - val_loss: 0.2310 - val_mean_absolute_error: 0.3834
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.1008 - mean_absolute_error: 0.2158 - val_loss: 0.2470 - val_mean_absolute_error: 0.3878
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0901 - mean_absolute_error: 0.2056 - val_loss: 0.2786 - val_mean_absolute_error: 0.4135
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1036 - mean_absolute_error: 0.2281 - val_loss: 0.2663 - val_mean_absolute_error: 0.3999
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0637 - mean_absolute_error: 0.1648 - val_loss: 0.2321 - val_mean_absolute_error: 0.3591
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0880 - mean_absolute_error: 0.2054 - val_loss: 0.2498 - val_mean_absolute_error: 0.3983
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0765 - mean_absolute_error: 0.1798 - val_loss: 0.2866 - val_mean_absolute_error: 0.4324
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0595 - mean_absolute_error: 0.1547 - val_loss: 0.2189 - val_mean_absolute_error: 0.3691
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0817 - mean_absolute_error: 0.1918 - val_loss: 0.2046 - val_mean_absolute_error: 0.3524
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0571 - mean_absolute_error: 0.1503 - val_loss: 0.2726 - val_mean_absolute_error: 0.4110
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1111 - mean_absolute_error: 0.2214 - val_loss: 0.2315 - val_mean_absolute_error: 0.3820
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0793 - mean_absolute_error: 0.1968 - val_loss: 0.2213 - val_mean_absolute_error: 0.3386
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0867 - mean_absolute_error: 0.2065 - val_loss: 0.2874 - val_mean_absolute_error: 0.4169
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0822 - mean_absolute_error: 0.1869 - val_loss: 0.2646 - val_mean_absolute_error: 0.3900
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0845 - mean_absolute_error: 0.1930 - val_loss: 0.1736 - val_mean_absolute_error: 0.3061
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0787 - mean_absolute_error: 0.1772 - val_loss: 0.1671 - val_mean_absolute_error: 0.3184
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0583 - mean_absolute_error: 0.1545 - val_loss: 0.2753 - val_mean_absolute_error: 0.4278
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0933 - mean_absolute_error: 0.2116 - val_loss: 0.2374 - val_mean_absolute_error: 0.3885
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0465 - mean_absolute_error: 0.1305 - val_loss: 0.1930 - val_mean_absolute_error: 0.3280
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0771 - mean_absolute_error: 0.1928 - val_loss: 0.2082 - val_mean_absolute_error: 0.3472
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0465 - mean_absolute_error: 0.1298 - val_loss: 0.2916 - val_mean_absolute_error: 0.4130
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0903 - mean_absolute_error: 0.1973 - val_loss: 0.2310 - val_mean_absolute_error: 0.3713
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0318 - mean_absolute_error: 0.0975 - val_loss: 0.2224 - val_mean_absolute_error: 0.3454
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0613 - mean_absolute_error: 0.1547 - val_loss: 0.2103 - val_mean_absolute_error: 0.3439
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0463 - mean_absolute_error: 0.1352 - val_loss: 0.2123 - val_mean_absolute_error: 0.3631
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0609 - mean_absolute_error: 0.1367 - val_loss: 0.1862 - val_mean_absolute_error: 0.3319
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0372 - mean_absolute_error: 0.1151 - val_loss: 0.1758 - val_mean_absolute_error: 0.3050
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0436 - mean_absolute_error: 0.1318 - val_loss: 0.1831 - val_mean_absolute_error: 0.3335
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0451 - mean_absolute_error: 0.1257 - val_loss: 0.1854 - val_mean_absolute_error: 0.3375
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0420 - mean_absolute_error: 0.1195 - val_loss: 0.1857 - val_mean_absolute_error: 0.3297
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0471 - mean_absolute_error: 0.1349 - val_loss: 0.1878 - val_mean_absolute_error: 0.3290
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0560 - mean_absolute_error: 0.1479 - val_loss: 0.1920 - val_mean_absolute_error: 0.3221
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0428 - mean_absolute_error: 0.1184 - val_loss: 0.1904 - val_mean_absolute_error: 0.3264
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0445 - mean_absolute_error: 0.1256 - val_loss: 0.2213 - val_mean_absolute_error: 0.3668
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0559 - mean_absolute_error: 0.1435 - val_loss: 0.1997 - val_mean_absolute_error: 0.3392
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0496 - mean_absolute_error: 0.1386 - val_loss: 0.1866 - val_mean_absolute_error: 0.3083
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0512 - mean_absolute_error: 0.1381 - val_loss: 0.2145 - val_mean_absolute_error: 0.3599
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0388 - mean_absolute_error: 0.1152 - val_loss: 0.2444 - val_mean_absolute_error: 0.3832
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0497 - mean_absolute_error: 0.1427 - val_loss: 0.2121 - val_mean_absolute_error: 0.3259
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0495 - mean_absolute_error: 0.1452 - val_loss: 0.2258 - val_mean_absolute_error: 0.3352

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0974 - mean_absolute_error: 0.1993 - val_loss: 0.0248 - val_mean_absolute_error: 0.0819
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0579 - mean_absolute_error: 0.1509 - val_loss: 0.0387 - val_mean_absolute_error: 0.1283
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0728 - mean_absolute_error: 0.1770 - val_loss: 0.0539 - val_mean_absolute_error: 0.1726
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0558 - mean_absolute_error: 0.1703 - val_loss: 0.0891 - val_mean_absolute_error: 0.2190
Epoch 5/50
2025-08-09 17:03:45.755662: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0704 - mean_absolute_error: 0.1723 - val_loss: 0.0587 - val_mean_absolute_error: 0.1782
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0590 - mean_absolute_error: 0.1549 - val_loss: 0.0529 - val_mean_absolute_error: 0.1643
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0482 - mean_absolute_error: 0.1307 - val_loss: 0.0602 - val_mean_absolute_error: 0.1617
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0609 - mean_absolute_error: 0.1533 - val_loss: 0.0530 - val_mean_absolute_error: 0.1617
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0496 - mean_absolute_error: 0.1359 - val_loss: 0.0637 - val_mean_absolute_error: 0.1706
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0541 - mean_absolute_error: 0.1404 - val_loss: 0.0583 - val_mean_absolute_error: 0.1682
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0471 - mean_absolute_error: 0.1371 - val_loss: 0.0495 - val_mean_absolute_error: 0.1439
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0373 - mean_absolute_error: 0.1129 - val_loss: 0.0454 - val_mean_absolute_error: 0.1394
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0418 - mean_absolute_error: 0.1180 - val_loss: 0.0561 - val_mean_absolute_error: 0.1576
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0363 - mean_absolute_error: 0.1123 - val_loss: 0.0678 - val_mean_absolute_error: 0.1877
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0510 - mean_absolute_error: 0.1403 - val_loss: 0.0746 - val_mean_absolute_error: 0.1962
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0583 - mean_absolute_error: 0.1573 - val_loss: 0.0703 - val_mean_absolute_error: 0.1935
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0576 - mean_absolute_error: 0.1488 - val_loss: 0.0585 - val_mean_absolute_error: 0.1734
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0380 - mean_absolute_error: 0.1129 - val_loss: 0.0666 - val_mean_absolute_error: 0.1747
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0448 - mean_absolute_error: 0.1274 - val_loss: 0.0513 - val_mean_absolute_error: 0.1596
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0441 - mean_absolute_error: 0.1281 - val_loss: 0.0567 - val_mean_absolute_error: 0.1617
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0384 - mean_absolute_error: 0.1203 - val_loss: 0.0554 - val_mean_absolute_error: 0.1695
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0377 - mean_absolute_error: 0.1225 - val_loss: 0.0660 - val_mean_absolute_error: 0.1981
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0371 - mean_absolute_error: 0.1203 - val_loss: 0.0700 - val_mean_absolute_error: 0.1807
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0429 - mean_absolute_error: 0.1225 - val_loss: 0.0556 - val_mean_absolute_error: 0.1689
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0360 - mean_absolute_error: 0.1129 - val_loss: 0.0490 - val_mean_absolute_error: 0.1570
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0346 - mean_absolute_error: 0.1110 - val_loss: 0.0554 - val_mean_absolute_error: 0.1520
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0308 - mean_absolute_error: 0.0983 - val_loss: 0.0524 - val_mean_absolute_error: 0.1523
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0327 - mean_absolute_error: 0.1111 - val_loss: 0.0572 - val_mean_absolute_error: 0.1636
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0431 - mean_absolute_error: 0.1157 - val_loss: 0.0731 - val_mean_absolute_error: 0.1844
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0281 - mean_absolute_error: 0.0930 - val_loss: 0.0717 - val_mean_absolute_error: 0.1834
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0348 - mean_absolute_error: 0.1115 - val_loss: 0.0701 - val_mean_absolute_error: 0.1738
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0421 - mean_absolute_error: 0.1158 - val_loss: 0.0738 - val_mean_absolute_error: 0.1919
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0334 - mean_absolute_error: 0.0969 - val_loss: 0.0605 - val_mean_absolute_error: 0.1700
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0333 - mean_absolute_error: 0.1006 - val_loss: 0.0594 - val_mean_absolute_error: 0.1687
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0382 - mean_absolute_error: 0.1147 - val_loss: 0.0590 - val_mean_absolute_error: 0.1775
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0311 - mean_absolute_error: 0.1027 - val_loss: 0.0680 - val_mean_absolute_error: 0.1656
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0465 - mean_absolute_error: 0.1286 - val_loss: 0.0732 - val_mean_absolute_error: 0.1984
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0481 - mean_absolute_error: 0.1449 - val_loss: 0.0733 - val_mean_absolute_error: 0.2029
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0451 - mean_absolute_error: 0.1283 - val_loss: 0.0792 - val_mean_absolute_error: 0.1918
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0448 - mean_absolute_error: 0.1325 - val_loss: 0.0897 - val_mean_absolute_error: 0.2281
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0514 - mean_absolute_error: 0.1678 - val_loss: 0.0677 - val_mean_absolute_error: 0.1772
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0420 - mean_absolute_error: 0.1290 - val_loss: 0.0620 - val_mean_absolute_error: 0.1608
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0342 - mean_absolute_error: 0.1099 - val_loss: 0.0829 - val_mean_absolute_error: 0.2081
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0408 - mean_absolute_error: 0.1324 - val_loss: 0.0645 - val_mean_absolute_error: 0.1693
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0400 - mean_absolute_error: 0.1234 - val_loss: 0.0530 - val_mean_absolute_error: 0.1526
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0344 - mean_absolute_error: 0.1170 - val_loss: 0.0865 - val_mean_absolute_error: 0.2208
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0501 - mean_absolute_error: 0.1424 - val_loss: 0.0739 - val_mean_absolute_error: 0.1941
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0310 - mean_absolute_error: 0.1036 - val_loss: 0.0606 - val_mean_absolute_error: 0.1744
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0407 - mean_absolute_error: 0.1177 - val_loss: 0.0780 - val_mean_absolute_error: 0.1965
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0371 - mean_absolute_error: 0.1178 - val_loss: 0.0666 - val_mean_absolute_error: 0.1887

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0309 - mean_absolute_error: 0.1066 - val_loss: 0.0357 - val_mean_absolute_error: 0.1014
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0513 - mean_absolute_error: 0.1394 - val_loss: 0.0218 - val_mean_absolute_error: 0.0656
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0412 - mean_absolute_error: 0.1140 - val_loss: 0.0292 - val_mean_absolute_error: 0.0884
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0441 - mean_absolute_error: 0.1261 - val_loss: 0.0210 - val_mean_absolute_error: 0.0779
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0488 - mean_absolute_error: 0.1258 - val_loss: 0.0228 - val_mean_absolute_error: 0.0821
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0349 - mean_absolute_error: 0.1156 - val_loss: 0.0486 - val_mean_absolute_error: 0.1464
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0436 - mean_absolute_error: 0.1330 - val_loss: 0.0206 - val_mean_absolute_error: 0.0737
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0422 - mean_absolute_error: 0.1210 - val_loss: 0.0242 - val_mean_absolute_error: 0.0867
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0328 - mean_absolute_error: 0.1069 - val_loss: 0.0175 - val_mean_absolute_error: 0.0640
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0337 - mean_absolute_error: 0.1058 - val_loss: 0.0307 - val_mean_absolute_error: 0.0992
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0366 - mean_absolute_error: 0.1085 - val_loss: 0.0232 - val_mean_absolute_error: 0.0846
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0365 - mean_absolute_error: 0.1140 - val_loss: 0.0192 - val_mean_absolute_error: 0.0702
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0351 - mean_absolute_error: 0.1163 - val_loss: 0.0270 - val_mean_absolute_error: 0.0904
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0429 - mean_absolute_error: 0.1234 - val_loss: 0.0279 - val_mean_absolute_error: 0.0866
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0355 - mean_absolute_error: 0.1159 - val_loss: 0.0278 - val_mean_absolute_error: 0.0897
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0329 - mean_absolute_error: 0.1135 - val_loss: 0.0254 - val_mean_absolute_error: 0.1035
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0297 - mean_absolute_error: 0.1083 - val_loss: 0.0297 - val_mean_absolute_error: 0.1008
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0423 - mean_absolute_error: 0.1226 - val_loss: 0.0206 - val_mean_absolute_error: 0.0801
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0325 - mean_absolute_error: 0.1025 - val_loss: 0.0216 - val_mean_absolute_error: 0.0751
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0386 - mean_absolute_error: 0.1232 - val_loss: 0.0230 - val_mean_absolute_error: 0.0820
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0315 - mean_absolute_error: 0.1156 - val_loss: 0.0263 - val_mean_absolute_error: 0.1028
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0305 - mean_absolute_error: 0.1092 - val_loss: 0.0261 - val_mean_absolute_error: 0.0962
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0366 - mean_absolute_error: 0.1015 - val_loss: 0.0290 - val_mean_absolute_error: 0.1032
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0327 - mean_absolute_error: 0.1034 - val_loss: 0.0316 - val_mean_absolute_error: 0.1202
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0336 - mean_absolute_error: 0.1105 - val_loss: 0.0347 - val_mean_absolute_error: 0.1144
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0358 - mean_absolute_error: 0.1150 - val_loss: 0.0332 - val_mean_absolute_error: 0.1272
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0461 - mean_absolute_error: 0.1408 - val_loss: 0.0282 - val_mean_absolute_error: 0.0942
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0301 - mean_absolute_error: 0.1051 - val_loss: 0.0254 - val_mean_absolute_error: 0.0998
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0345 - mean_absolute_error: 0.1164 - val_loss: 0.0340 - val_mean_absolute_error: 0.1223
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0343 - mean_absolute_error: 0.1108 - val_loss: 0.0444 - val_mean_absolute_error: 0.1309
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0356 - mean_absolute_error: 0.1119 - val_loss: 0.0429 - val_mean_absolute_error: 0.1288
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0315 - mean_absolute_error: 0.1137 - val_loss: 0.0351 - val_mean_absolute_error: 0.1132
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0354 - mean_absolute_error: 0.1149 - val_loss: 0.0333 - val_mean_absolute_error: 0.1149
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0308 - mean_absolute_error: 0.1094 - val_loss: 0.0342 - val_mean_absolute_error: 0.1147
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0246 - mean_absolute_error: 0.0872 - val_loss: 0.0479 - val_mean_absolute_error: 0.1367
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0420 - mean_absolute_error: 0.1368 - val_loss: 0.0330 - val_mean_absolute_error: 0.1258
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0324 - mean_absolute_error: 0.1193 - val_loss: 0.0256 - val_mean_absolute_error: 0.0896
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0293 - mean_absolute_error: 0.0968 - val_loss: 0.0292 - val_mean_absolute_error: 0.0928
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0338 - mean_absolute_error: 0.0923 - val_loss: 0.0413 - val_mean_absolute_error: 0.1283
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0360 - mean_absolute_error: 0.1104 - val_loss: 0.0333 - val_mean_absolute_error: 0.1224
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0439 - mean_absolute_error: 0.1337 - val_loss: 0.0327 - val_mean_absolute_error: 0.1282
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0399 - mean_absolute_error: 0.1151 - val_loss: 0.0448 - val_mean_absolute_error: 0.1570
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0397 - mean_absolute_error: 0.1213 - val_loss: 0.0412 - val_mean_absolute_error: 0.1321
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0695 - mean_absolute_error: 0.1651 - val_loss: 0.0409 - val_mean_absolute_error: 0.1201
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0315 - mean_absolute_error: 0.1090 - val_loss: 0.0547 - val_mean_absolute_error: 0.1415
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0250 - mean_absolute_error: 0.0913 - val_loss: 0.0586 - val_mean_absolute_error: 0.1606
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0471 - mean_absolute_error: 0.1317 - val_loss: 0.0616 - val_mean_absolute_error: 0.1691
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0466 - mean_absolute_error: 0.1399 - val_loss: 0.0685 - val_mean_absolute_error: 0.1787
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0481 - mean_absolute_error: 0.1386 - val_loss: 0.0467 - val_mean_absolute_error: 0.1556
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0529 - mean_absolute_error: 0.1497 - val_loss: 0.0352 - val_mean_absolute_error: 0.1129

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.0419 - mean_absolute_error: 0.1291 - val_loss: 0.0191 - val_mean_absolute_error: 0.0715
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0347 - mean_absolute_error: 0.1182 - val_loss: 0.0185 - val_mean_absolute_error: 0.0772
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0514 - mean_absolute_error: 0.1411 - val_loss: 0.0174 - val_mean_absolute_error: 0.0759
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0395 - mean_absolute_error: 0.1366 - val_loss: 0.0220 - val_mean_absolute_error: 0.0862
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0353 - mean_absolute_error: 0.1179 - val_loss: 0.0145 - val_mean_absolute_error: 0.0579
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0450 - mean_absolute_error: 0.1395 - val_loss: 0.0175 - val_mean_absolute_error: 0.0683
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0458 - mean_absolute_error: 0.1470 - val_loss: 0.0304 - val_mean_absolute_error: 0.1105
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0550 - mean_absolute_error: 0.1656 - val_loss: 0.0349 - val_mean_absolute_error: 0.1263
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0431 - mean_absolute_error: 0.1346 - val_loss: 0.0296 - val_mean_absolute_error: 0.1189
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0349 - mean_absolute_error: 0.1242 - val_loss: 0.0392 - val_mean_absolute_error: 0.1315
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0432 - mean_absolute_error: 0.1357 - val_loss: 0.0239 - val_mean_absolute_error: 0.1013
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0336 - mean_absolute_error: 0.1103 - val_loss: 0.0248 - val_mean_absolute_error: 0.0976
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0412 - mean_absolute_error: 0.1412 - val_loss: 0.0390 - val_mean_absolute_error: 0.1430
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0664 - mean_absolute_error: 0.1702 - val_loss: 0.0320 - val_mean_absolute_error: 0.1133
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0508 - mean_absolute_error: 0.1522 - val_loss: 0.0257 - val_mean_absolute_error: 0.1008
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0473 - mean_absolute_error: 0.1368 - val_loss: 0.0343 - val_mean_absolute_error: 0.1220
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0503 - mean_absolute_error: 0.1488 - val_loss: 0.0191 - val_mean_absolute_error: 0.0802
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0369 - mean_absolute_error: 0.1187 - val_loss: 0.0248 - val_mean_absolute_error: 0.0932
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0395 - mean_absolute_error: 0.1334 - val_loss: 0.0260 - val_mean_absolute_error: 0.1012
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0426 - mean_absolute_error: 0.1250 - val_loss: 0.0212 - val_mean_absolute_error: 0.0815
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0349 - mean_absolute_error: 0.1196 - val_loss: 0.0275 - val_mean_absolute_error: 0.1088
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0399 - mean_absolute_error: 0.1334 - val_loss: 0.0271 - val_mean_absolute_error: 0.1060
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0269 - mean_absolute_error: 0.1055 - val_loss: 0.0303 - val_mean_absolute_error: 0.1250
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0344 - mean_absolute_error: 0.1320 - val_loss: 0.0281 - val_mean_absolute_error: 0.1000
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0408 - mean_absolute_error: 0.1220 - val_loss: 0.0244 - val_mean_absolute_error: 0.0966
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0540 - mean_absolute_error: 0.1559 - val_loss: 0.0309 - val_mean_absolute_error: 0.1215
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0452 - mean_absolute_error: 0.1523 - val_loss: 0.0291 - val_mean_absolute_error: 0.1064
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0546 - mean_absolute_error: 0.1506 - val_loss: 0.0222 - val_mean_absolute_error: 0.0973
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0365 - mean_absolute_error: 0.1317 - val_loss: 0.0272 - val_mean_absolute_error: 0.1133
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0281 - mean_absolute_error: 0.1030 - val_loss: 0.0393 - val_mean_absolute_error: 0.1393
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0493 - mean_absolute_error: 0.1437 - val_loss: 0.0238 - val_mean_absolute_error: 0.0923
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0296 - mean_absolute_error: 0.1121 - val_loss: 0.0243 - val_mean_absolute_error: 0.0922
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0305 - mean_absolute_error: 0.1131 - val_loss: 0.0233 - val_mean_absolute_error: 0.1024
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0246 - mean_absolute_error: 0.0910 - val_loss: 0.0257 - val_mean_absolute_error: 0.1126
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0294 - mean_absolute_error: 0.1122 - val_loss: 0.0259 - val_mean_absolute_error: 0.1128
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0357 - mean_absolute_error: 0.1181 - val_loss: 0.0316 - val_mean_absolute_error: 0.1123
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0444 - mean_absolute_error: 0.1397 - val_loss: 0.0330 - val_mean_absolute_error: 0.1291
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0544 - mean_absolute_error: 0.1517 - val_loss: 0.0347 - val_mean_absolute_error: 0.1308
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0483 - mean_absolute_error: 0.1519 - val_loss: 0.0209 - val_mean_absolute_error: 0.0897
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0279 - mean_absolute_error: 0.0983 - val_loss: 0.0270 - val_mean_absolute_error: 0.1127
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0331 - mean_absolute_error: 0.1146 - val_loss: 0.0190 - val_mean_absolute_error: 0.0752
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0360 - mean_absolute_error: 0.1223 - val_loss: 0.0195 - val_mean_absolute_error: 0.0780
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0248 - mean_absolute_error: 0.1016 - val_loss: 0.0332 - val_mean_absolute_error: 0.1141
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0458 - mean_absolute_error: 0.1463 - val_loss: 0.0284 - val_mean_absolute_error: 0.1108
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0326 - mean_absolute_error: 0.1227 - val_loss: 0.0254 - val_mean_absolute_error: 0.0972
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0261 - mean_absolute_error: 0.1000 - val_loss: 0.0366 - val_mean_absolute_error: 0.1327
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0493 - mean_absolute_error: 0.1349 - val_loss: 0.0322 - val_mean_absolute_error: 0.1114
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0474 - mean_absolute_error: 0.1451 - val_loss: 0.0319 - val_mean_absolute_error: 0.1286
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0400 - mean_absolute_error: 0.1098 - val_loss: 0.0316 - val_mean_absolute_error: 0.1292
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0380 - mean_absolute_error: 0.1100 - val_loss: 0.0372 - val_mean_absolute_error: 0.1477

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0368 - mean_absolute_error: 0.1323 - val_loss: 0.0143 - val_mean_absolute_error: 0.0577
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0253 - mean_absolute_error: 0.1010 - val_loss: 0.0261 - val_mean_absolute_error: 0.1105
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0310 - mean_absolute_error: 0.1129 - val_loss: 0.0183 - val_mean_absolute_error: 0.0837
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0255 - mean_absolute_error: 0.0899 - val_loss: 0.0174 - val_mean_absolute_error: 0.0737
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0297 - mean_absolute_error: 0.1006 - val_loss: 0.0318 - val_mean_absolute_error: 0.1249
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0290 - mean_absolute_error: 0.1007 - val_loss: 0.0158 - val_mean_absolute_error: 0.0692
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0245 - mean_absolute_error: 0.0983 - val_loss: 0.0139 - val_mean_absolute_error: 0.0575
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0232 - mean_absolute_error: 0.0906 - val_loss: 0.0200 - val_mean_absolute_error: 0.0887
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0247 - mean_absolute_error: 0.0997 - val_loss: 0.0188 - val_mean_absolute_error: 0.0776
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0292 - mean_absolute_error: 0.1064 - val_loss: 0.0176 - val_mean_absolute_error: 0.0725
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0252 - mean_absolute_error: 0.0976 - val_loss: 0.0257 - val_mean_absolute_error: 0.1030
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0225 - mean_absolute_error: 0.0884 - val_loss: 0.0238 - val_mean_absolute_error: 0.1050
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0301 - mean_absolute_error: 0.0979 - val_loss: 0.0143 - val_mean_absolute_error: 0.0645
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0428 - mean_absolute_error: 0.1163 - val_loss: 0.0311 - val_mean_absolute_error: 0.1284
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0296 - mean_absolute_error: 0.0988 - val_loss: 0.0166 - val_mean_absolute_error: 0.0732
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0240 - mean_absolute_error: 0.0966 - val_loss: 0.0140 - val_mean_absolute_error: 0.0587
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0243 - mean_absolute_error: 0.0960 - val_loss: 0.0217 - val_mean_absolute_error: 0.0975
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0244 - mean_absolute_error: 0.0918 - val_loss: 0.0233 - val_mean_absolute_error: 0.1051
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0250 - mean_absolute_error: 0.0953 - val_loss: 0.0252 - val_mean_absolute_error: 0.1093
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0315 - mean_absolute_error: 0.1095 - val_loss: 0.0249 - val_mean_absolute_error: 0.0993
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0286 - mean_absolute_error: 0.1023 - val_loss: 0.0156 - val_mean_absolute_error: 0.0705
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0212 - mean_absolute_error: 0.0844 - val_loss: 0.0331 - val_mean_absolute_error: 0.1173
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0307 - mean_absolute_error: 0.1095 - val_loss: 0.0297 - val_mean_absolute_error: 0.1229
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0358 - mean_absolute_error: 0.1115 - val_loss: 0.0189 - val_mean_absolute_error: 0.0862
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0439 - mean_absolute_error: 0.1194 - val_loss: 0.0289 - val_mean_absolute_error: 0.1139
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0298 - mean_absolute_error: 0.1146 - val_loss: 0.0140 - val_mean_absolute_error: 0.0642
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0243 - mean_absolute_error: 0.1013 - val_loss: 0.0154 - val_mean_absolute_error: 0.0691
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0251 - mean_absolute_error: 0.1009 - val_loss: 0.0481 - val_mean_absolute_error: 0.1712
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0396 - mean_absolute_error: 0.1311 - val_loss: 0.0214 - val_mean_absolute_error: 0.0958
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0252 - mean_absolute_error: 0.0916 - val_loss: 0.0214 - val_mean_absolute_error: 0.0913
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0285 - mean_absolute_error: 0.1135 - val_loss: 0.0231 - val_mean_absolute_error: 0.0995
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0216 - mean_absolute_error: 0.0900 - val_loss: 0.0220 - val_mean_absolute_error: 0.0920
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0206 - mean_absolute_error: 0.0857 - val_loss: 0.0172 - val_mean_absolute_error: 0.0817
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0264 - mean_absolute_error: 0.0946 - val_loss: 0.0328 - val_mean_absolute_error: 0.1263
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0322 - mean_absolute_error: 0.1073 - val_loss: 0.0136 - val_mean_absolute_error: 0.0660
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0227 - mean_absolute_error: 0.0928 - val_loss: 0.0149 - val_mean_absolute_error: 0.0702
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0206 - mean_absolute_error: 0.0746 - val_loss: 0.0283 - val_mean_absolute_error: 0.1196
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0248 - mean_absolute_error: 0.0961 - val_loss: 0.0193 - val_mean_absolute_error: 0.0891
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0243 - mean_absolute_error: 0.0903 - val_loss: 0.0174 - val_mean_absolute_error: 0.0822
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0207 - mean_absolute_error: 0.0785 - val_loss: 0.0213 - val_mean_absolute_error: 0.1012
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0171 - mean_absolute_error: 0.0757 - val_loss: 0.0162 - val_mean_absolute_error: 0.0771
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0205 - mean_absolute_error: 0.0850 - val_loss: 0.0361 - val_mean_absolute_error: 0.1333
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0250 - mean_absolute_error: 0.1020 - val_loss: 0.0154 - val_mean_absolute_error: 0.0771
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0241 - mean_absolute_error: 0.0923 - val_loss: 0.0253 - val_mean_absolute_error: 0.1051
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0187 - mean_absolute_error: 0.0757 - val_loss: 0.0274 - val_mean_absolute_error: 0.1108
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0188 - mean_absolute_error: 0.0805 - val_loss: 0.0153 - val_mean_absolute_error: 0.0714
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0227 - mean_absolute_error: 0.0894 - val_loss: 0.0237 - val_mean_absolute_error: 0.1077
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0239 - mean_absolute_error: 0.0966 - val_loss: 0.0228 - val_mean_absolute_error: 0.1021
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0199 - mean_absolute_error: 0.0852 - val_loss: 0.0287 - val_mean_absolute_error: 0.1139
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0328 - mean_absolute_error: 0.1130 - val_loss: 0.0222 - val_mean_absolute_error: 0.0989
Validation losses: [0.22580662369728088, 0.06659550219774246, 0.03523147478699684, 0.037208061665296555, 0.02223942056298256]
HPS: {'player_emb_dim': 8, 'dense_units': 112, 'dense_units_2': 96, 'learning_rate': 0.01, 'dropout_rate': 0.4}. MSE during RandomSearch: 0.21570256352424622. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 88ms/step - loss: 1.1634 - mean_absolute_error: 0.8640 - val_loss: 1.1779 - val_mean_absolute_error: 0.8520
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 1.0524 - mean_absolute_error: 0.7991 - val_loss: 1.5931 - val_mean_absolute_error: 1.0357
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.9252 - mean_absolute_error: 0.7575 - val_loss: 1.4223 - val_mean_absolute_error: 0.9485
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.7880 - mean_absolute_error: 0.6872 - val_loss: 1.2354 - val_mean_absolute_error: 0.8519
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.5475 - mean_absolute_error: 0.5817 - val_loss: 1.4669 - val_mean_absolute_error: 0.9505
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.9747 - mean_absolute_error: 0.7925 - val_loss: 1.1830 - val_mean_absolute_error: 0.8668
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.7720 - mean_absolute_error: 0.7030 - val_loss: 0.9636 - val_mean_absolute_error: 0.7793
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.3947 - mean_absolute_error: 0.4850 - val_loss: 0.8813 - val_mean_absolute_error: 0.7429
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.3745 - mean_absolute_error: 0.4789 - val_loss: 0.6903 - val_mean_absolute_error: 0.6517
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.3032 - mean_absolute_error: 0.4279 - val_loss: 0.5914 - val_mean_absolute_error: 0.5905
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3109 - mean_absolute_error: 0.4143 - val_loss: 0.5715 - val_mean_absolute_error: 0.5829
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.3667 - mean_absolute_error: 0.4393 - val_loss: 0.6201 - val_mean_absolute_error: 0.6370
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3096 - mean_absolute_error: 0.4524 - val_loss: 0.6020 - val_mean_absolute_error: 0.6108
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3392 - mean_absolute_error: 0.4315 - val_loss: 0.5434 - val_mean_absolute_error: 0.5753
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3036 - mean_absolute_error: 0.4149 - val_loss: 0.3514 - val_mean_absolute_error: 0.4897
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2740 - mean_absolute_error: 0.3915 - val_loss: 0.2464 - val_mean_absolute_error: 0.3999
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2862 - mean_absolute_error: 0.3999 - val_loss: 0.3486 - val_mean_absolute_error: 0.4780
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2031 - mean_absolute_error: 0.3481 - val_loss: 0.3977 - val_mean_absolute_error: 0.5134
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2448 - mean_absolute_error: 0.3427 - val_loss: 0.3635 - val_mean_absolute_error: 0.4888
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1715 - mean_absolute_error: 0.2991 - val_loss: 0.3334 - val_mean_absolute_error: 0.4679
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2029 - mean_absolute_error: 0.3307 - val_loss: 0.3583 - val_mean_absolute_error: 0.4872
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1780 - mean_absolute_error: 0.3212 - val_loss: 0.3107 - val_mean_absolute_error: 0.4473
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1884 - mean_absolute_error: 0.3316 - val_loss: 0.3009 - val_mean_absolute_error: 0.4486
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1437 - mean_absolute_error: 0.2829 - val_loss: 0.2840 - val_mean_absolute_error: 0.4334
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1451 - mean_absolute_error: 0.2700 - val_loss: 0.2008 - val_mean_absolute_error: 0.3545
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1532 - mean_absolute_error: 0.2785 - val_loss: 0.1854 - val_mean_absolute_error: 0.3431
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1370 - mean_absolute_error: 0.2844 - val_loss: 0.2264 - val_mean_absolute_error: 0.3803
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1721 - mean_absolute_error: 0.2892 - val_loss: 0.2848 - val_mean_absolute_error: 0.4225
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1508 - mean_absolute_error: 0.2659 - val_loss: 0.2316 - val_mean_absolute_error: 0.3987
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1324 - mean_absolute_error: 0.2607 - val_loss: 0.1842 - val_mean_absolute_error: 0.3470
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1469 - mean_absolute_error: 0.2320 - val_loss: 0.2930 - val_mean_absolute_error: 0.4145
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1728 - mean_absolute_error: 0.3100 - val_loss: 0.3640 - val_mean_absolute_error: 0.4726
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1834 - mean_absolute_error: 0.2980 - val_loss: 0.2401 - val_mean_absolute_error: 0.4252
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1654 - mean_absolute_error: 0.2934 - val_loss: 0.1793 - val_mean_absolute_error: 0.3578
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1862 - mean_absolute_error: 0.3229 - val_loss: 0.2705 - val_mean_absolute_error: 0.4108
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1464 - mean_absolute_error: 0.2726 - val_loss: 0.2905 - val_mean_absolute_error: 0.4285
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1189 - mean_absolute_error: 0.2539 - val_loss: 0.1986 - val_mean_absolute_error: 0.3676
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1722 - mean_absolute_error: 0.2871 - val_loss: 0.1920 - val_mean_absolute_error: 0.3567
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1703 - mean_absolute_error: 0.2852 - val_loss: 0.2215 - val_mean_absolute_error: 0.3825
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1225 - mean_absolute_error: 0.2495 - val_loss: 0.1953 - val_mean_absolute_error: 0.3494
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1271 - mean_absolute_error: 0.2625 - val_loss: 0.1772 - val_mean_absolute_error: 0.3285
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1417 - mean_absolute_error: 0.2669 - val_loss: 0.2294 - val_mean_absolute_error: 0.3856
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1165 - mean_absolute_error: 0.2354 - val_loss: 0.3085 - val_mean_absolute_error: 0.4312
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1789 - mean_absolute_error: 0.2891 - val_loss: 0.2224 - val_mean_absolute_error: 0.3705
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 79ms/step - loss: 0.1306 - mean_absolute_error: 0.2565 - val_loss: 0.1619 - val_mean_absolute_error: 0.3119
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1758 - mean_absolute_error: 0.2652 - val_loss: 0.2176 - val_mean_absolute_error: 0.3839
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1163 - mean_absolute_error: 0.2526 - val_loss: 0.2373 - val_mean_absolute_error: 0.3902
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1868 - mean_absolute_error: 0.2682 - val_loss: 0.2000 - val_mean_absolute_error: 0.3615
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1558 - mean_absolute_error: 0.2731 - val_loss: 0.2497 - val_mean_absolute_error: 0.3902
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1506 - mean_absolute_error: 0.2802 - val_loss: 0.2552 - val_mean_absolute_error: 0.4023

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.2139 - mean_absolute_error: 0.3532 - val_loss: 0.0883 - val_mean_absolute_error: 0.2121
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1859 - mean_absolute_error: 0.3161 - val_loss: 0.0714 - val_mean_absolute_error: 0.1681
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0913 - mean_absolute_error: 0.2177 - val_loss: 0.0999 - val_mean_absolute_error: 0.1997
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1773 - mean_absolute_error: 0.2824 - val_loss: 0.1396 - val_mean_absolute_error: 0.2571
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1487 - mean_absolute_error: 0.2542 - val_loss: 0.0995 - val_mean_absolute_error: 0.2180
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1409 - mean_absolute_error: 0.2767 - val_loss: 0.0902 - val_mean_absolute_error: 0.2331
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2068 - mean_absolute_error: 0.3310 - val_loss: 0.0839 - val_mean_absolute_error: 0.2171
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1699 - mean_absolute_error: 0.2924 - val_loss: 0.1097 - val_mean_absolute_error: 0.2222
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1262 - mean_absolute_error: 0.2538 - val_loss: 0.1290 - val_mean_absolute_error: 0.2444
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1228 - mean_absolute_error: 0.2445 - val_loss: 0.0966 - val_mean_absolute_error: 0.2035
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1019 - mean_absolute_error: 0.2313 - val_loss: 0.1098 - val_mean_absolute_error: 0.2274
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1293 - mean_absolute_error: 0.2681 - val_loss: 0.1296 - val_mean_absolute_error: 0.2291
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1179 - mean_absolute_error: 0.2432 - val_loss: 0.1686 - val_mean_absolute_error: 0.2888
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1407 - mean_absolute_error: 0.2582 - val_loss: 0.1012 - val_mean_absolute_error: 0.2017
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1092 - mean_absolute_error: 0.2335 - val_loss: 0.0902 - val_mean_absolute_error: 0.1965
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0919 - mean_absolute_error: 0.2142 - val_loss: 0.1230 - val_mean_absolute_error: 0.2475
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1075 - mean_absolute_error: 0.2356 - val_loss: 0.1309 - val_mean_absolute_error: 0.2471
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0992 - mean_absolute_error: 0.2147 - val_loss: 0.1086 - val_mean_absolute_error: 0.2198
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0875 - mean_absolute_error: 0.2073 - val_loss: 0.1179 - val_mean_absolute_error: 0.2319
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1602 - mean_absolute_error: 0.2649 - val_loss: 0.0869 - val_mean_absolute_error: 0.2042
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0956 - mean_absolute_error: 0.2276 - val_loss: 0.0974 - val_mean_absolute_error: 0.1948
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0974 - mean_absolute_error: 0.2171 - val_loss: 0.0973 - val_mean_absolute_error: 0.2276
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1033 - mean_absolute_error: 0.2146 - val_loss: 0.0952 - val_mean_absolute_error: 0.2210
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0993 - mean_absolute_error: 0.2145 - val_loss: 0.1125 - val_mean_absolute_error: 0.2204
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1082 - mean_absolute_error: 0.2159 - val_loss: 0.1292 - val_mean_absolute_error: 0.2377
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0886 - mean_absolute_error: 0.2017 - val_loss: 0.1190 - val_mean_absolute_error: 0.2457
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0894 - mean_absolute_error: 0.2017 - val_loss: 0.1171 - val_mean_absolute_error: 0.2306
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0748 - mean_absolute_error: 0.1901 - val_loss: 0.1058 - val_mean_absolute_error: 0.2119
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0756 - mean_absolute_error: 0.1857 - val_loss: 0.0965 - val_mean_absolute_error: 0.2342
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0936 - mean_absolute_error: 0.2102 - val_loss: 0.0847 - val_mean_absolute_error: 0.2023
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0808 - mean_absolute_error: 0.1827 - val_loss: 0.1014 - val_mean_absolute_error: 0.2095
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0693 - mean_absolute_error: 0.1793 - val_loss: 0.0917 - val_mean_absolute_error: 0.2109
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0960 - mean_absolute_error: 0.2035 - val_loss: 0.1107 - val_mean_absolute_error: 0.2280
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0983 - mean_absolute_error: 0.2112 - val_loss: 0.1426 - val_mean_absolute_error: 0.2634
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0815 - mean_absolute_error: 0.1892 - val_loss: 0.1722 - val_mean_absolute_error: 0.2980
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0870 - mean_absolute_error: 0.2044 - val_loss: 0.1355 - val_mean_absolute_error: 0.2660
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1060 - mean_absolute_error: 0.2226 - val_loss: 0.1308 - val_mean_absolute_error: 0.2359
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0754 - mean_absolute_error: 0.1956 - val_loss: 0.1535 - val_mean_absolute_error: 0.2595
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0876 - mean_absolute_error: 0.2012 - val_loss: 0.1313 - val_mean_absolute_error: 0.2381
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0805 - mean_absolute_error: 0.1961 - val_loss: 0.1210 - val_mean_absolute_error: 0.2418
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0899 - mean_absolute_error: 0.1984 - val_loss: 0.1110 - val_mean_absolute_error: 0.2015
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1080 - mean_absolute_error: 0.2345 - val_loss: 0.1114 - val_mean_absolute_error: 0.2110
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0953 - mean_absolute_error: 0.2067 - val_loss: 0.1142 - val_mean_absolute_error: 0.2220
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0707 - mean_absolute_error: 0.1813 - val_loss: 0.0983 - val_mean_absolute_error: 0.1972
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0972 - mean_absolute_error: 0.2242 - val_loss: 0.1096 - val_mean_absolute_error: 0.2351
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0945 - mean_absolute_error: 0.1937 - val_loss: 0.1128 - val_mean_absolute_error: 0.2272
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0666 - mean_absolute_error: 0.1624 - val_loss: 0.1101 - val_mean_absolute_error: 0.2233
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0807 - mean_absolute_error: 0.1902 - val_loss: 0.1124 - val_mean_absolute_error: 0.2270
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0644 - mean_absolute_error: 0.1742 - val_loss: 0.1451 - val_mean_absolute_error: 0.2756
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0988 - mean_absolute_error: 0.1963 - val_loss: 0.1537 - val_mean_absolute_error: 0.2668

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.1308 - mean_absolute_error: 0.2615 - val_loss: 0.0295 - val_mean_absolute_error: 0.0931
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1409 - mean_absolute_error: 0.2589 - val_loss: 0.0257 - val_mean_absolute_error: 0.0864
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0940 - mean_absolute_error: 0.2199 - val_loss: 0.0810 - val_mean_absolute_error: 0.1814
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1977 - mean_absolute_error: 0.2557 - val_loss: 0.1145 - val_mean_absolute_error: 0.2657
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1607 - mean_absolute_error: 0.3172 - val_loss: 0.1477 - val_mean_absolute_error: 0.2944
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1864 - mean_absolute_error: 0.3228 - val_loss: 0.0704 - val_mean_absolute_error: 0.1790
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1277 - mean_absolute_error: 0.2599 - val_loss: 0.0553 - val_mean_absolute_error: 0.1651
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1210 - mean_absolute_error: 0.2592 - val_loss: 0.0641 - val_mean_absolute_error: 0.1861
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0666 - mean_absolute_error: 0.1942 - val_loss: 0.0756 - val_mean_absolute_error: 0.1948
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1484 - mean_absolute_error: 0.2578 - val_loss: 0.0573 - val_mean_absolute_error: 0.1740
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1234 - mean_absolute_error: 0.2324 - val_loss: 0.0531 - val_mean_absolute_error: 0.1691
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1370 - mean_absolute_error: 0.2614 - val_loss: 0.0989 - val_mean_absolute_error: 0.2195
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1013 - mean_absolute_error: 0.2044 - val_loss: 0.1087 - val_mean_absolute_error: 0.2217
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1376 - mean_absolute_error: 0.2547 - val_loss: 0.0626 - val_mean_absolute_error: 0.1738
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0966 - mean_absolute_error: 0.2254 - val_loss: 0.0682 - val_mean_absolute_error: 0.1955
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1166 - mean_absolute_error: 0.2349 - val_loss: 0.0734 - val_mean_absolute_error: 0.1992
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0653 - mean_absolute_error: 0.1703 - val_loss: 0.0894 - val_mean_absolute_error: 0.2048
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0888 - mean_absolute_error: 0.1991 - val_loss: 0.1091 - val_mean_absolute_error: 0.2384
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1078 - mean_absolute_error: 0.2427 - val_loss: 0.1245 - val_mean_absolute_error: 0.2487
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1211 - mean_absolute_error: 0.2497 - val_loss: 0.0552 - val_mean_absolute_error: 0.1658
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1097 - mean_absolute_error: 0.2273 - val_loss: 0.0579 - val_mean_absolute_error: 0.1633
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0946 - mean_absolute_error: 0.2122 - val_loss: 0.0560 - val_mean_absolute_error: 0.1687
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0948 - mean_absolute_error: 0.2064 - val_loss: 0.1127 - val_mean_absolute_error: 0.2397
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1217 - mean_absolute_error: 0.2499 - val_loss: 0.1204 - val_mean_absolute_error: 0.2398
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0851 - mean_absolute_error: 0.2032 - val_loss: 0.0834 - val_mean_absolute_error: 0.1963
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0939 - mean_absolute_error: 0.2119 - val_loss: 0.0629 - val_mean_absolute_error: 0.1707
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0694 - mean_absolute_error: 0.1785 - val_loss: 0.0745 - val_mean_absolute_error: 0.1917
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0882 - mean_absolute_error: 0.2120 - val_loss: 0.0702 - val_mean_absolute_error: 0.1896
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0982 - mean_absolute_error: 0.2255 - val_loss: 0.0577 - val_mean_absolute_error: 0.1646
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0935 - mean_absolute_error: 0.2113 - val_loss: 0.0467 - val_mean_absolute_error: 0.1521
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0854 - mean_absolute_error: 0.2052 - val_loss: 0.0703 - val_mean_absolute_error: 0.1859
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0649 - mean_absolute_error: 0.1669 - val_loss: 0.0807 - val_mean_absolute_error: 0.2034
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1077 - mean_absolute_error: 0.2182 - val_loss: 0.1005 - val_mean_absolute_error: 0.2232
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1008 - mean_absolute_error: 0.2196 - val_loss: 0.1178 - val_mean_absolute_error: 0.2456
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0743 - mean_absolute_error: 0.2066 - val_loss: 0.0705 - val_mean_absolute_error: 0.1909
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1242 - mean_absolute_error: 0.2518 - val_loss: 0.0594 - val_mean_absolute_error: 0.1748
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1242 - mean_absolute_error: 0.2446 - val_loss: 0.0532 - val_mean_absolute_error: 0.1665
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1578 - mean_absolute_error: 0.2655 - val_loss: 0.0825 - val_mean_absolute_error: 0.2090
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1010 - mean_absolute_error: 0.2183 - val_loss: 0.0782 - val_mean_absolute_error: 0.1937
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0910 - mean_absolute_error: 0.2067 - val_loss: 0.0646 - val_mean_absolute_error: 0.1727
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0984 - mean_absolute_error: 0.2219 - val_loss: 0.0858 - val_mean_absolute_error: 0.2126
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0937 - mean_absolute_error: 0.2257 - val_loss: 0.0823 - val_mean_absolute_error: 0.2017
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0893 - mean_absolute_error: 0.2030 - val_loss: 0.0612 - val_mean_absolute_error: 0.1734
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0799 - mean_absolute_error: 0.1876 - val_loss: 0.0690 - val_mean_absolute_error: 0.1810
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0759 - mean_absolute_error: 0.1758 - val_loss: 0.0634 - val_mean_absolute_error: 0.1748
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0729 - mean_absolute_error: 0.2009 - val_loss: 0.0664 - val_mean_absolute_error: 0.1867
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0912 - mean_absolute_error: 0.1968 - val_loss: 0.1311 - val_mean_absolute_error: 0.2693
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1111 - mean_absolute_error: 0.2434 - val_loss: 0.1397 - val_mean_absolute_error: 0.2725
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1269 - mean_absolute_error: 0.2402 - val_loss: 0.0753 - val_mean_absolute_error: 0.2057
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0866 - mean_absolute_error: 0.1981 - val_loss: 0.0618 - val_mean_absolute_error: 0.1866

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.1085 - mean_absolute_error: 0.2288 - val_loss: 0.0290 - val_mean_absolute_error: 0.1065
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0779 - mean_absolute_error: 0.1971 - val_loss: 0.0319 - val_mean_absolute_error: 0.1030
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1253 - mean_absolute_error: 0.2325 - val_loss: 0.0468 - val_mean_absolute_error: 0.1574
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1460 - mean_absolute_error: 0.2624 - val_loss: 0.0279 - val_mean_absolute_error: 0.1002
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1035 - mean_absolute_error: 0.2298 - val_loss: 0.0537 - val_mean_absolute_error: 0.1661
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1789 - mean_absolute_error: 0.3033 - val_loss: 0.0325 - val_mean_absolute_error: 0.1128
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0761 - mean_absolute_error: 0.2069 - val_loss: 0.0556 - val_mean_absolute_error: 0.1862
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1443 - mean_absolute_error: 0.2724 - val_loss: 0.0293 - val_mean_absolute_error: 0.1107
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0774 - mean_absolute_error: 0.1986 - val_loss: 0.0386 - val_mean_absolute_error: 0.1257
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1453 - mean_absolute_error: 0.2522 - val_loss: 0.0416 - val_mean_absolute_error: 0.1298
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0924 - mean_absolute_error: 0.2184 - val_loss: 0.0421 - val_mean_absolute_error: 0.1400
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0948 - mean_absolute_error: 0.2176 - val_loss: 0.0555 - val_mean_absolute_error: 0.1763
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1068 - mean_absolute_error: 0.2377 - val_loss: 0.0485 - val_mean_absolute_error: 0.1645
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1298 - mean_absolute_error: 0.2552 - val_loss: 0.0626 - val_mean_absolute_error: 0.1866
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1181 - mean_absolute_error: 0.2529 - val_loss: 0.0494 - val_mean_absolute_error: 0.1484
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0932 - mean_absolute_error: 0.2073 - val_loss: 0.0556 - val_mean_absolute_error: 0.1759
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1311 - mean_absolute_error: 0.2618 - val_loss: 0.0336 - val_mean_absolute_error: 0.1233
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0887 - mean_absolute_error: 0.1990 - val_loss: 0.0434 - val_mean_absolute_error: 0.1374
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1274 - mean_absolute_error: 0.2379 - val_loss: 0.0568 - val_mean_absolute_error: 0.1832
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1124 - mean_absolute_error: 0.2623 - val_loss: 0.0344 - val_mean_absolute_error: 0.1237
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1143 - mean_absolute_error: 0.2420 - val_loss: 0.0495 - val_mean_absolute_error: 0.1420
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1336 - mean_absolute_error: 0.2487 - val_loss: 0.0370 - val_mean_absolute_error: 0.1315
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0961 - mean_absolute_error: 0.2155 - val_loss: 0.0319 - val_mean_absolute_error: 0.1091
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1354 - mean_absolute_error: 0.2600 - val_loss: 0.0462 - val_mean_absolute_error: 0.1464
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0910 - mean_absolute_error: 0.2146 - val_loss: 0.0382 - val_mean_absolute_error: 0.1168
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0872 - mean_absolute_error: 0.2014 - val_loss: 0.0269 - val_mean_absolute_error: 0.0984
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1321 - mean_absolute_error: 0.2323 - val_loss: 0.0657 - val_mean_absolute_error: 0.1793
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1056 - mean_absolute_error: 0.2264 - val_loss: 0.0641 - val_mean_absolute_error: 0.1859
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0948 - mean_absolute_error: 0.2276 - val_loss: 0.0411 - val_mean_absolute_error: 0.1437
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0890 - mean_absolute_error: 0.2024 - val_loss: 0.0313 - val_mean_absolute_error: 0.1176
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0550 - mean_absolute_error: 0.1668 - val_loss: 0.0355 - val_mean_absolute_error: 0.1281
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1533 - mean_absolute_error: 0.2565 - val_loss: 0.0466 - val_mean_absolute_error: 0.1489
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1030 - mean_absolute_error: 0.2361 - val_loss: 0.0337 - val_mean_absolute_error: 0.1202
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1179 - mean_absolute_error: 0.2406 - val_loss: 0.0425 - val_mean_absolute_error: 0.1381
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0765 - mean_absolute_error: 0.1967 - val_loss: 0.0283 - val_mean_absolute_error: 0.1091
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1191 - mean_absolute_error: 0.2307 - val_loss: 0.0781 - val_mean_absolute_error: 0.2045
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1461 - mean_absolute_error: 0.2695 - val_loss: 0.0689 - val_mean_absolute_error: 0.1773
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1234 - mean_absolute_error: 0.2514 - val_loss: 0.0836 - val_mean_absolute_error: 0.2452
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1062 - mean_absolute_error: 0.2461 - val_loss: 0.0445 - val_mean_absolute_error: 0.1536
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1392 - mean_absolute_error: 0.2590 - val_loss: 0.0342 - val_mean_absolute_error: 0.1168
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1236 - mean_absolute_error: 0.2423 - val_loss: 0.0705 - val_mean_absolute_error: 0.2005
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1422 - mean_absolute_error: 0.2738 - val_loss: 0.0558 - val_mean_absolute_error: 0.1796
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1712 - mean_absolute_error: 0.2712 - val_loss: 0.0665 - val_mean_absolute_error: 0.1943
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1329 - mean_absolute_error: 0.2705 - val_loss: 0.0574 - val_mean_absolute_error: 0.1862
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1174 - mean_absolute_error: 0.2280 - val_loss: 0.0565 - val_mean_absolute_error: 0.1877
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1094 - mean_absolute_error: 0.2313 - val_loss: 0.0499 - val_mean_absolute_error: 0.1708
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0945 - mean_absolute_error: 0.2243 - val_loss: 0.0560 - val_mean_absolute_error: 0.1737
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0882 - mean_absolute_error: 0.2280 - val_loss: 0.0426 - val_mean_absolute_error: 0.1390
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1032 - mean_absolute_error: 0.2218 - val_loss: 0.0584 - val_mean_absolute_error: 0.1779
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0989 - mean_absolute_error: 0.2022 - val_loss: 0.0670 - val_mean_absolute_error: 0.1970

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.1042 - mean_absolute_error: 0.2339 - val_loss: 0.0330 - val_mean_absolute_error: 0.1098
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0960 - mean_absolute_error: 0.2291 - val_loss: 0.0282 - val_mean_absolute_error: 0.1008
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0829 - mean_absolute_error: 0.2150 - val_loss: 0.0351 - val_mean_absolute_error: 0.1174
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0820 - mean_absolute_error: 0.1993 - val_loss: 0.0245 - val_mean_absolute_error: 0.0898
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0570 - mean_absolute_error: 0.1631 - val_loss: 0.0285 - val_mean_absolute_error: 0.1008
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0807 - mean_absolute_error: 0.1926 - val_loss: 0.0362 - val_mean_absolute_error: 0.1245
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0656 - mean_absolute_error: 0.1784 - val_loss: 0.0498 - val_mean_absolute_error: 0.1483
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0903 - mean_absolute_error: 0.2105 - val_loss: 0.0401 - val_mean_absolute_error: 0.1332
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0790 - mean_absolute_error: 0.1991 - val_loss: 0.0364 - val_mean_absolute_error: 0.1197
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0897 - mean_absolute_error: 0.1908 - val_loss: 0.0443 - val_mean_absolute_error: 0.1540
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0970 - mean_absolute_error: 0.2294 - val_loss: 0.0316 - val_mean_absolute_error: 0.1068
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0663 - mean_absolute_error: 0.1873 - val_loss: 0.0557 - val_mean_absolute_error: 0.1576
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0837 - mean_absolute_error: 0.1988 - val_loss: 0.0273 - val_mean_absolute_error: 0.1040
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0531 - mean_absolute_error: 0.1691 - val_loss: 0.0396 - val_mean_absolute_error: 0.1348
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0538 - mean_absolute_error: 0.1624 - val_loss: 0.0462 - val_mean_absolute_error: 0.1567
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0704 - mean_absolute_error: 0.1792 - val_loss: 0.0270 - val_mean_absolute_error: 0.1018
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0767 - mean_absolute_error: 0.1878 - val_loss: 0.0255 - val_mean_absolute_error: 0.0959
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0469 - mean_absolute_error: 0.1381 - val_loss: 0.0259 - val_mean_absolute_error: 0.0973
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0472 - mean_absolute_error: 0.1533 - val_loss: 0.0413 - val_mean_absolute_error: 0.1455
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0748 - mean_absolute_error: 0.1865 - val_loss: 0.0395 - val_mean_absolute_error: 0.1334
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0536 - mean_absolute_error: 0.1513 - val_loss: 0.0403 - val_mean_absolute_error: 0.1427
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0595 - mean_absolute_error: 0.1734 - val_loss: 0.0267 - val_mean_absolute_error: 0.1000
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0491 - mean_absolute_error: 0.1448 - val_loss: 0.0293 - val_mean_absolute_error: 0.1104
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0763 - mean_absolute_error: 0.1810 - val_loss: 0.0498 - val_mean_absolute_error: 0.1542
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0611 - mean_absolute_error: 0.1732 - val_loss: 0.0363 - val_mean_absolute_error: 0.1270
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0655 - mean_absolute_error: 0.1681 - val_loss: 0.0338 - val_mean_absolute_error: 0.1300
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1105 - mean_absolute_error: 0.2316 - val_loss: 0.0311 - val_mean_absolute_error: 0.1222
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0720 - mean_absolute_error: 0.1957 - val_loss: 0.0388 - val_mean_absolute_error: 0.1431
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0643 - mean_absolute_error: 0.1676 - val_loss: 0.0489 - val_mean_absolute_error: 0.1463
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0788 - mean_absolute_error: 0.2069 - val_loss: 0.0360 - val_mean_absolute_error: 0.1312
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0480 - mean_absolute_error: 0.1421 - val_loss: 0.0314 - val_mean_absolute_error: 0.1228
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0545 - mean_absolute_error: 0.1555 - val_loss: 0.0619 - val_mean_absolute_error: 0.1935
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0647 - mean_absolute_error: 0.1804 - val_loss: 0.0286 - val_mean_absolute_error: 0.1038
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0776 - mean_absolute_error: 0.1811 - val_loss: 0.0384 - val_mean_absolute_error: 0.1326
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0877 - mean_absolute_error: 0.2125 - val_loss: 0.0515 - val_mean_absolute_error: 0.1592
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0627 - mean_absolute_error: 0.1758 - val_loss: 0.0918 - val_mean_absolute_error: 0.2299
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1095 - mean_absolute_error: 0.2101 - val_loss: 0.0675 - val_mean_absolute_error: 0.1888
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1066 - mean_absolute_error: 0.2311 - val_loss: 0.0526 - val_mean_absolute_error: 0.1715
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0672 - mean_absolute_error: 0.1962 - val_loss: 0.0629 - val_mean_absolute_error: 0.1779
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1313 - mean_absolute_error: 0.2295 - val_loss: 0.0373 - val_mean_absolute_error: 0.1275
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0552 - mean_absolute_error: 0.1492 - val_loss: 0.0448 - val_mean_absolute_error: 0.1463
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0772 - mean_absolute_error: 0.1795 - val_loss: 0.0639 - val_mean_absolute_error: 0.1743
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0888 - mean_absolute_error: 0.2028 - val_loss: 0.0436 - val_mean_absolute_error: 0.1394
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1220 - mean_absolute_error: 0.2070 - val_loss: 0.0448 - val_mean_absolute_error: 0.1427
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0879 - mean_absolute_error: 0.1943 - val_loss: 0.0454 - val_mean_absolute_error: 0.1377
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0826 - mean_absolute_error: 0.1781 - val_loss: 0.0349 - val_mean_absolute_error: 0.1207
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0701 - mean_absolute_error: 0.1704 - val_loss: 0.0309 - val_mean_absolute_error: 0.1220
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0786 - mean_absolute_error: 0.1805 - val_loss: 0.0576 - val_mean_absolute_error: 0.1754
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0724 - mean_absolute_error: 0.1787 - val_loss: 0.0355 - val_mean_absolute_error: 0.1259
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0987 - mean_absolute_error: 0.2134 - val_loss: 0.0343 - val_mean_absolute_error: 0.1245
Validation losses: [0.2551981806755066, 0.15366901457309723, 0.06183907389640808, 0.06702043116092682, 0.034276530146598816]
HPS: {'player_emb_dim': 8, 'dense_units': 64, 'dense_units_2': 80, 'learning_rate': 0.01, 'dropout_rate': 0.30000000000000004}. MSE during RandomSearch: 0.13918931782245636. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 89ms/step - loss: 1.2620 - mean_absolute_error: 0.9048 - val_loss: 1.1664 - val_mean_absolute_error: 0.8336
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.9313 - mean_absolute_error: 0.7225 - val_loss: 1.5012 - val_mean_absolute_error: 0.9920
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.9560 - mean_absolute_error: 0.7694 - val_loss: 1.2628 - val_mean_absolute_error: 0.8562
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.8383 - mean_absolute_error: 0.7013 - val_loss: 1.3717 - val_mean_absolute_error: 0.9851
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.6241 - mean_absolute_error: 0.5938 - val_loss: 0.9326 - val_mean_absolute_error: 0.7644
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.4562 - mean_absolute_error: 0.5179 - val_loss: 0.8447 - val_mean_absolute_error: 0.7170
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.4345 - mean_absolute_error: 0.4710 - val_loss: 0.9039 - val_mean_absolute_error: 0.7271
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3992 - mean_absolute_error: 0.4495 - val_loss: 0.6979 - val_mean_absolute_error: 0.6741
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3648 - mean_absolute_error: 0.4905 - val_loss: 0.6072 - val_mean_absolute_error: 0.6319
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.3512 - mean_absolute_error: 0.4606 - val_loss: 0.4743 - val_mean_absolute_error: 0.5651
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2274 - mean_absolute_error: 0.3702 - val_loss: 0.3721 - val_mean_absolute_error: 0.4939
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2094 - mean_absolute_error: 0.3587 - val_loss: 0.3538 - val_mean_absolute_error: 0.4687
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2563 - mean_absolute_error: 0.4028 - val_loss: 0.4777 - val_mean_absolute_error: 0.5398
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2368 - mean_absolute_error: 0.3857 - val_loss: 0.5276 - val_mean_absolute_error: 0.5776
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2347 - mean_absolute_error: 0.3616 - val_loss: 0.3525 - val_mean_absolute_error: 0.4895
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1881 - mean_absolute_error: 0.3228 - val_loss: 0.3340 - val_mean_absolute_error: 0.4663
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1776 - mean_absolute_error: 0.3071 - val_loss: 0.5277 - val_mean_absolute_error: 0.6002
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1554 - mean_absolute_error: 0.2982 - val_loss: 0.3886 - val_mean_absolute_error: 0.5202
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2137 - mean_absolute_error: 0.3679 - val_loss: 0.3245 - val_mean_absolute_error: 0.4713
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2450 - mean_absolute_error: 0.3617 - val_loss: 0.3419 - val_mean_absolute_error: 0.4742
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1478 - mean_absolute_error: 0.2822 - val_loss: 0.3640 - val_mean_absolute_error: 0.5050
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1367 - mean_absolute_error: 0.2919 - val_loss: 0.2224 - val_mean_absolute_error: 0.3932
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1301 - mean_absolute_error: 0.2686 - val_loss: 0.2042 - val_mean_absolute_error: 0.3773
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1695 - mean_absolute_error: 0.3132 - val_loss: 0.2593 - val_mean_absolute_error: 0.4238
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1506 - mean_absolute_error: 0.2716 - val_loss: 0.2232 - val_mean_absolute_error: 0.3908
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2035 - mean_absolute_error: 0.3280 - val_loss: 0.2048 - val_mean_absolute_error: 0.3659
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1231 - mean_absolute_error: 0.2584 - val_loss: 0.2245 - val_mean_absolute_error: 0.3701
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1534 - mean_absolute_error: 0.2771 - val_loss: 0.2465 - val_mean_absolute_error: 0.3749
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1004 - mean_absolute_error: 0.2072 - val_loss: 0.2120 - val_mean_absolute_error: 0.3501
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1657 - mean_absolute_error: 0.2882 - val_loss: 0.2185 - val_mean_absolute_error: 0.3535
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0997 - mean_absolute_error: 0.2333 - val_loss: 0.2424 - val_mean_absolute_error: 0.3790
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1386 - mean_absolute_error: 0.2613 - val_loss: 0.2045 - val_mean_absolute_error: 0.3453
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1202 - mean_absolute_error: 0.2369 - val_loss: 0.1822 - val_mean_absolute_error: 0.3372
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1081 - mean_absolute_error: 0.2473 - val_loss: 0.2194 - val_mean_absolute_error: 0.3741
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0980 - mean_absolute_error: 0.2300 - val_loss: 0.2551 - val_mean_absolute_error: 0.4058
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0857 - mean_absolute_error: 0.2100 - val_loss: 0.2106 - val_mean_absolute_error: 0.3804
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1284 - mean_absolute_error: 0.2533 - val_loss: 0.2182 - val_mean_absolute_error: 0.3874
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1196 - mean_absolute_error: 0.2389 - val_loss: 0.2346 - val_mean_absolute_error: 0.4071
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0862 - mean_absolute_error: 0.2150 - val_loss: 0.2337 - val_mean_absolute_error: 0.4091
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1231 - mean_absolute_error: 0.2398 - val_loss: 0.1885 - val_mean_absolute_error: 0.3592
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0961 - mean_absolute_error: 0.2184 - val_loss: 0.1956 - val_mean_absolute_error: 0.3551
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0721 - mean_absolute_error: 0.1914 - val_loss: 0.2244 - val_mean_absolute_error: 0.3733
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1019 - mean_absolute_error: 0.2243 - val_loss: 0.2010 - val_mean_absolute_error: 0.3522
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1234 - mean_absolute_error: 0.2536 - val_loss: 0.1796 - val_mean_absolute_error: 0.3459
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0984 - mean_absolute_error: 0.2261 - val_loss: 0.2349 - val_mean_absolute_error: 0.4105
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1039 - mean_absolute_error: 0.2241 - val_loss: 0.2234 - val_mean_absolute_error: 0.3951
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0980 - mean_absolute_error: 0.2241 - val_loss: 0.1984 - val_mean_absolute_error: 0.3541
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0667 - mean_absolute_error: 0.1816 - val_loss: 0.1892 - val_mean_absolute_error: 0.3378
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0763 - mean_absolute_error: 0.1846 - val_loss: 0.1865 - val_mean_absolute_error: 0.3424
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1007 - mean_absolute_error: 0.2282 - val_loss: 0.1928 - val_mean_absolute_error: 0.3577

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 28ms/step - loss: 0.1246 - mean_absolute_error: 0.2653 - val_loss: 0.0250 - val_mean_absolute_error: 0.0943
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1222 - mean_absolute_error: 0.2498 - val_loss: 0.0416 - val_mean_absolute_error: 0.1186
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1004 - mean_absolute_error: 0.2246 - val_loss: 0.0749 - val_mean_absolute_error: 0.1809
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0992 - mean_absolute_error: 0.2220 - val_loss: 0.0486 - val_mean_absolute_error: 0.1328
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1259 - mean_absolute_error: 0.2451 - val_loss: 0.0605 - val_mean_absolute_error: 0.1778
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0971 - mean_absolute_error: 0.2165 - val_loss: 0.0597 - val_mean_absolute_error: 0.1536
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0793 - mean_absolute_error: 0.2035 - val_loss: 0.0805 - val_mean_absolute_error: 0.1731
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0938 - mean_absolute_error: 0.2203 - val_loss: 0.1237 - val_mean_absolute_error: 0.2437
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0944 - mean_absolute_error: 0.2238 - val_loss: 0.0990 - val_mean_absolute_error: 0.2005
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0911 - mean_absolute_error: 0.2088 - val_loss: 0.0705 - val_mean_absolute_error: 0.1892
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1144 - mean_absolute_error: 0.2260 - val_loss: 0.0852 - val_mean_absolute_error: 0.2072
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0979 - mean_absolute_error: 0.2047 - val_loss: 0.1499 - val_mean_absolute_error: 0.2900
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1165 - mean_absolute_error: 0.2609 - val_loss: 0.0860 - val_mean_absolute_error: 0.2024
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0894 - mean_absolute_error: 0.2135 - val_loss: 0.0685 - val_mean_absolute_error: 0.2052
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1123 - mean_absolute_error: 0.2405 - val_loss: 0.0757 - val_mean_absolute_error: 0.1915
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0856 - mean_absolute_error: 0.2149 - val_loss: 0.1176 - val_mean_absolute_error: 0.2628
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0798 - mean_absolute_error: 0.1939 - val_loss: 0.0869 - val_mean_absolute_error: 0.2052
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0680 - mean_absolute_error: 0.1821 - val_loss: 0.0728 - val_mean_absolute_error: 0.1762
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0576 - mean_absolute_error: 0.1498 - val_loss: 0.0840 - val_mean_absolute_error: 0.2161
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0896 - mean_absolute_error: 0.2108 - val_loss: 0.0791 - val_mean_absolute_error: 0.1865
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0802 - mean_absolute_error: 0.1968 - val_loss: 0.0955 - val_mean_absolute_error: 0.2018
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0756 - mean_absolute_error: 0.1917 - val_loss: 0.1043 - val_mean_absolute_error: 0.2500
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0968 - mean_absolute_error: 0.2039 - val_loss: 0.0904 - val_mean_absolute_error: 0.1956
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0994 - mean_absolute_error: 0.2215 - val_loss: 0.1101 - val_mean_absolute_error: 0.2309
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0703 - mean_absolute_error: 0.1863 - val_loss: 0.0916 - val_mean_absolute_error: 0.2364
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0648 - mean_absolute_error: 0.1680 - val_loss: 0.1066 - val_mean_absolute_error: 0.2374
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0647 - mean_absolute_error: 0.1675 - val_loss: 0.1192 - val_mean_absolute_error: 0.2285
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0790 - mean_absolute_error: 0.1847 - val_loss: 0.1365 - val_mean_absolute_error: 0.2487
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step - loss: 0.0533 - mean_absolute_error: 0.1621 - val_loss: 0.1237 - val_mean_absolute_error: 0.2363
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0616 - mean_absolute_error: 0.1763 - val_loss: 0.0881 - val_mean_absolute_error: 0.2067
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0899 - mean_absolute_error: 0.2188 - val_loss: 0.0775 - val_mean_absolute_error: 0.2010
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0720 - mean_absolute_error: 0.1901 - val_loss: 0.0979 - val_mean_absolute_error: 0.2234
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0691 - mean_absolute_error: 0.1714 - val_loss: 0.1306 - val_mean_absolute_error: 0.2717
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0967 - mean_absolute_error: 0.2179 - val_loss: 0.0702 - val_mean_absolute_error: 0.1688
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0740 - mean_absolute_error: 0.1895 - val_loss: 0.0747 - val_mean_absolute_error: 0.1916
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0850 - mean_absolute_error: 0.2032 - val_loss: 0.1027 - val_mean_absolute_error: 0.2124
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0729 - mean_absolute_error: 0.1800 - val_loss: 0.1556 - val_mean_absolute_error: 0.2877
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0719 - mean_absolute_error: 0.1994 - val_loss: 0.0862 - val_mean_absolute_error: 0.2077
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1010 - mean_absolute_error: 0.2155 - val_loss: 0.0922 - val_mean_absolute_error: 0.2054
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0692 - mean_absolute_error: 0.1839 - val_loss: 0.1222 - val_mean_absolute_error: 0.2380
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0916 - mean_absolute_error: 0.2012 - val_loss: 0.1096 - val_mean_absolute_error: 0.2178
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0544 - mean_absolute_error: 0.1525 - val_loss: 0.0878 - val_mean_absolute_error: 0.1891
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0785 - mean_absolute_error: 0.1941 - val_loss: 0.0939 - val_mean_absolute_error: 0.2027
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0658 - mean_absolute_error: 0.1812 - val_loss: 0.0839 - val_mean_absolute_error: 0.2034
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0601 - mean_absolute_error: 0.1609 - val_loss: 0.1076 - val_mean_absolute_error: 0.2380
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0992 - mean_absolute_error: 0.2224 - val_loss: 0.1292 - val_mean_absolute_error: 0.2378
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0534 - mean_absolute_error: 0.1698 - val_loss: 0.1310 - val_mean_absolute_error: 0.2519
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0634 - mean_absolute_error: 0.1794 - val_loss: 0.1105 - val_mean_absolute_error: 0.2307
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0849 - mean_absolute_error: 0.2203 - val_loss: 0.0993 - val_mean_absolute_error: 0.2196
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0739 - mean_absolute_error: 0.1916 - val_loss: 0.1292 - val_mean_absolute_error: 0.2874

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0818 - mean_absolute_error: 0.2055 - val_loss: 0.0414 - val_mean_absolute_error: 0.1450
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0961 - mean_absolute_error: 0.2187 - val_loss: 0.0339 - val_mean_absolute_error: 0.1324
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0588 - mean_absolute_error: 0.1682 - val_loss: 0.0334 - val_mean_absolute_error: 0.1251
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1105 - mean_absolute_error: 0.2286 - val_loss: 0.0499 - val_mean_absolute_error: 0.1420
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1103 - mean_absolute_error: 0.2413 - val_loss: 0.0558 - val_mean_absolute_error: 0.1537
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1121 - mean_absolute_error: 0.2412 - val_loss: 0.0352 - val_mean_absolute_error: 0.1132
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0829 - mean_absolute_error: 0.1866 - val_loss: 0.0412 - val_mean_absolute_error: 0.1320
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0883 - mean_absolute_error: 0.2030 - val_loss: 0.0401 - val_mean_absolute_error: 0.1375
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0770 - mean_absolute_error: 0.1896 - val_loss: 0.0406 - val_mean_absolute_error: 0.1495
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0793 - mean_absolute_error: 0.1839 - val_loss: 0.0427 - val_mean_absolute_error: 0.1263
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0939 - mean_absolute_error: 0.2005 - val_loss: 0.0466 - val_mean_absolute_error: 0.1386
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0647 - mean_absolute_error: 0.1792 - val_loss: 0.0674 - val_mean_absolute_error: 0.1926
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0819 - mean_absolute_error: 0.1973 - val_loss: 0.0637 - val_mean_absolute_error: 0.1805
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0722 - mean_absolute_error: 0.1777 - val_loss: 0.0605 - val_mean_absolute_error: 0.1735
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1524 - mean_absolute_error: 0.2393 - val_loss: 0.0657 - val_mean_absolute_error: 0.1720
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1109 - mean_absolute_error: 0.2127 - val_loss: 0.1056 - val_mean_absolute_error: 0.2254
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0842 - mean_absolute_error: 0.2004 - val_loss: 0.0786 - val_mean_absolute_error: 0.1927
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0675 - mean_absolute_error: 0.1876 - val_loss: 0.0693 - val_mean_absolute_error: 0.1942
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0655 - mean_absolute_error: 0.1704 - val_loss: 0.0889 - val_mean_absolute_error: 0.2046
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0872 - mean_absolute_error: 0.2086 - val_loss: 0.0993 - val_mean_absolute_error: 0.2225
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1130 - mean_absolute_error: 0.2349 - val_loss: 0.0630 - val_mean_absolute_error: 0.1853
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1203 - mean_absolute_error: 0.2673 - val_loss: 0.0371 - val_mean_absolute_error: 0.1380
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1226 - mean_absolute_error: 0.2383 - val_loss: 0.1025 - val_mean_absolute_error: 0.2330
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0915 - mean_absolute_error: 0.2207 - val_loss: 0.0783 - val_mean_absolute_error: 0.2026
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1228 - mean_absolute_error: 0.2485 - val_loss: 0.0520 - val_mean_absolute_error: 0.1548
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0885 - mean_absolute_error: 0.1951 - val_loss: 0.0656 - val_mean_absolute_error: 0.1866
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1007 - mean_absolute_error: 0.2260 - val_loss: 0.0564 - val_mean_absolute_error: 0.1681
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0908 - mean_absolute_error: 0.1881 - val_loss: 0.0819 - val_mean_absolute_error: 0.2193
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1299 - mean_absolute_error: 0.2646 - val_loss: 0.0732 - val_mean_absolute_error: 0.1954
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0941 - mean_absolute_error: 0.2075 - val_loss: 0.0983 - val_mean_absolute_error: 0.2434
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0755 - mean_absolute_error: 0.2005 - val_loss: 0.0787 - val_mean_absolute_error: 0.2101
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0778 - mean_absolute_error: 0.1830 - val_loss: 0.0869 - val_mean_absolute_error: 0.1986
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0938 - mean_absolute_error: 0.2162 - val_loss: 0.0972 - val_mean_absolute_error: 0.2437
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0845 - mean_absolute_error: 0.2207 - val_loss: 0.0939 - val_mean_absolute_error: 0.2260
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0896 - mean_absolute_error: 0.2125 - val_loss: 0.0836 - val_mean_absolute_error: 0.1881
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0975 - mean_absolute_error: 0.2391 - val_loss: 0.1007 - val_mean_absolute_error: 0.2180
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1022 - mean_absolute_error: 0.1986 - val_loss: 0.1138 - val_mean_absolute_error: 0.2537
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1519 - mean_absolute_error: 0.2856 - val_loss: 0.1538 - val_mean_absolute_error: 0.2932
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1740 - mean_absolute_error: 0.3070 - val_loss: 0.0720 - val_mean_absolute_error: 0.2071
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1072 - mean_absolute_error: 0.2585 - val_loss: 0.0894 - val_mean_absolute_error: 0.2284
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1322 - mean_absolute_error: 0.2712 - val_loss: 0.0890 - val_mean_absolute_error: 0.2299
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1430 - mean_absolute_error: 0.2595 - val_loss: 0.0721 - val_mean_absolute_error: 0.1887
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1137 - mean_absolute_error: 0.2394 - val_loss: 0.0704 - val_mean_absolute_error: 0.1869
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0858 - mean_absolute_error: 0.2119 - val_loss: 0.0773 - val_mean_absolute_error: 0.2122
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0720 - mean_absolute_error: 0.1989 - val_loss: 0.0612 - val_mean_absolute_error: 0.1743
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0861 - mean_absolute_error: 0.1976 - val_loss: 0.0603 - val_mean_absolute_error: 0.1798
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0841 - mean_absolute_error: 0.2049 - val_loss: 0.0794 - val_mean_absolute_error: 0.1988
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0950 - mean_absolute_error: 0.2114 - val_loss: 0.0748 - val_mean_absolute_error: 0.1953
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1174 - mean_absolute_error: 0.2318 - val_loss: 0.0907 - val_mean_absolute_error: 0.2267
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0972 - mean_absolute_error: 0.2263 - val_loss: 0.0703 - val_mean_absolute_error: 0.1850

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0964 - mean_absolute_error: 0.2050 - val_loss: 0.0780 - val_mean_absolute_error: 0.1840
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1984 - mean_absolute_error: 0.3333 - val_loss: 0.1026 - val_mean_absolute_error: 0.2394
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1974 - mean_absolute_error: 0.3285 - val_loss: 0.0916 - val_mean_absolute_error: 0.2501
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.1351 - mean_absolute_error: 0.2790 - val_loss: 0.0394 - val_mean_absolute_error: 0.1387
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1320 - mean_absolute_error: 0.2699 - val_loss: 0.0355 - val_mean_absolute_error: 0.1352
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0891 - mean_absolute_error: 0.2005 - val_loss: 0.0591 - val_mean_absolute_error: 0.1616
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1060 - mean_absolute_error: 0.2220 - val_loss: 0.0399 - val_mean_absolute_error: 0.1453
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0813 - mean_absolute_error: 0.2133 - val_loss: 0.0360 - val_mean_absolute_error: 0.1393
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1065 - mean_absolute_error: 0.2203 - val_loss: 0.0556 - val_mean_absolute_error: 0.1640
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0952 - mean_absolute_error: 0.2234 - val_loss: 0.0535 - val_mean_absolute_error: 0.1615
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0977 - mean_absolute_error: 0.2018 - val_loss: 0.0711 - val_mean_absolute_error: 0.1814
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1142 - mean_absolute_error: 0.2231 - val_loss: 0.0640 - val_mean_absolute_error: 0.1679
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1067 - mean_absolute_error: 0.2030 - val_loss: 0.0580 - val_mean_absolute_error: 0.1858
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0948 - mean_absolute_error: 0.2144 - val_loss: 0.0432 - val_mean_absolute_error: 0.1540
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0879 - mean_absolute_error: 0.1890 - val_loss: 0.0684 - val_mean_absolute_error: 0.1829
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1014 - mean_absolute_error: 0.2222 - val_loss: 0.0523 - val_mean_absolute_error: 0.1632
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0869 - mean_absolute_error: 0.2014 - val_loss: 0.0496 - val_mean_absolute_error: 0.1576
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0668 - mean_absolute_error: 0.1743 - val_loss: 0.0408 - val_mean_absolute_error: 0.1439
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0684 - mean_absolute_error: 0.1685 - val_loss: 0.0432 - val_mean_absolute_error: 0.1420
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0840 - mean_absolute_error: 0.1815 - val_loss: 0.0406 - val_mean_absolute_error: 0.1448
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0594 - mean_absolute_error: 0.1659 - val_loss: 0.0407 - val_mean_absolute_error: 0.1407
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0514 - mean_absolute_error: 0.1543 - val_loss: 0.0553 - val_mean_absolute_error: 0.1604
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0793 - mean_absolute_error: 0.2122 - val_loss: 0.0353 - val_mean_absolute_error: 0.1292
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0839 - mean_absolute_error: 0.1954 - val_loss: 0.0356 - val_mean_absolute_error: 0.1385
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0592 - mean_absolute_error: 0.1696 - val_loss: 0.0522 - val_mean_absolute_error: 0.1618
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0548 - mean_absolute_error: 0.1557 - val_loss: 0.0501 - val_mean_absolute_error: 0.1559
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0619 - mean_absolute_error: 0.1699 - val_loss: 0.0352 - val_mean_absolute_error: 0.1384
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0648 - mean_absolute_error: 0.1693 - val_loss: 0.0388 - val_mean_absolute_error: 0.1478
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0930 - mean_absolute_error: 0.1780 - val_loss: 0.0622 - val_mean_absolute_error: 0.1711
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0852 - mean_absolute_error: 0.2273 - val_loss: 0.0436 - val_mean_absolute_error: 0.1414
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0773 - mean_absolute_error: 0.1860 - val_loss: 0.0351 - val_mean_absolute_error: 0.1437
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0754 - mean_absolute_error: 0.1732 - val_loss: 0.0341 - val_mean_absolute_error: 0.1364
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0759 - mean_absolute_error: 0.1950 - val_loss: 0.0384 - val_mean_absolute_error: 0.1431
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0751 - mean_absolute_error: 0.1705 - val_loss: 0.0432 - val_mean_absolute_error: 0.1445
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0627 - mean_absolute_error: 0.1817 - val_loss: 0.0445 - val_mean_absolute_error: 0.1483
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0668 - mean_absolute_error: 0.1916 - val_loss: 0.0438 - val_mean_absolute_error: 0.1532
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0650 - mean_absolute_error: 0.1804 - val_loss: 0.0462 - val_mean_absolute_error: 0.1676
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0837 - mean_absolute_error: 0.1964 - val_loss: 0.0588 - val_mean_absolute_error: 0.1711
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0838 - mean_absolute_error: 0.1998 - val_loss: 0.0721 - val_mean_absolute_error: 0.1858
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0728 - mean_absolute_error: 0.1897 - val_loss: 0.0522 - val_mean_absolute_error: 0.1812
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1061 - mean_absolute_error: 0.2375 - val_loss: 0.0415 - val_mean_absolute_error: 0.1597
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0747 - mean_absolute_error: 0.2006 - val_loss: 0.0490 - val_mean_absolute_error: 0.1521
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0875 - mean_absolute_error: 0.1975 - val_loss: 0.0510 - val_mean_absolute_error: 0.1717
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0979 - mean_absolute_error: 0.2144 - val_loss: 0.0359 - val_mean_absolute_error: 0.1416
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0424 - mean_absolute_error: 0.1439 - val_loss: 0.0429 - val_mean_absolute_error: 0.1519
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1079 - mean_absolute_error: 0.2142 - val_loss: 0.0549 - val_mean_absolute_error: 0.1773
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0683 - mean_absolute_error: 0.1829 - val_loss: 0.0576 - val_mean_absolute_error: 0.1836
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1174 - mean_absolute_error: 0.2459 - val_loss: 0.0565 - val_mean_absolute_error: 0.1659
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0785 - mean_absolute_error: 0.1833 - val_loss: 0.0417 - val_mean_absolute_error: 0.1600
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0596 - mean_absolute_error: 0.1790 - val_loss: 0.0685 - val_mean_absolute_error: 0.2041

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0798 - mean_absolute_error: 0.2071 - val_loss: 0.0551 - val_mean_absolute_error: 0.1826
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0565 - mean_absolute_error: 0.1648 - val_loss: 0.0404 - val_mean_absolute_error: 0.1486
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0748 - mean_absolute_error: 0.1872 - val_loss: 0.0257 - val_mean_absolute_error: 0.1158
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0660 - mean_absolute_error: 0.1794 - val_loss: 0.0321 - val_mean_absolute_error: 0.1248
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0708 - mean_absolute_error: 0.1992 - val_loss: 0.0768 - val_mean_absolute_error: 0.2015
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0603 - mean_absolute_error: 0.1632 - val_loss: 0.0640 - val_mean_absolute_error: 0.1804
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0470 - mean_absolute_error: 0.1498 - val_loss: 0.0235 - val_mean_absolute_error: 0.1045
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0581 - mean_absolute_error: 0.1627 - val_loss: 0.0238 - val_mean_absolute_error: 0.1043
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0467 - mean_absolute_error: 0.1573 - val_loss: 0.0405 - val_mean_absolute_error: 0.1532
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0698 - mean_absolute_error: 0.1663 - val_loss: 0.0735 - val_mean_absolute_error: 0.1932
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0430 - mean_absolute_error: 0.1387 - val_loss: 0.0466 - val_mean_absolute_error: 0.1538
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0582 - mean_absolute_error: 0.1576 - val_loss: 0.0229 - val_mean_absolute_error: 0.1002
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0659 - mean_absolute_error: 0.1652 - val_loss: 0.0480 - val_mean_absolute_error: 0.1573
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 65ms/step - loss: 0.0576 - mean_absolute_error: 0.1697 - val_loss: 0.0469 - val_mean_absolute_error: 0.1671
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0390 - mean_absolute_error: 0.1316 - val_loss: 0.0270 - val_mean_absolute_error: 0.1113
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0431 - mean_absolute_error: 0.1348 - val_loss: 0.0294 - val_mean_absolute_error: 0.1164
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0364 - mean_absolute_error: 0.1340 - val_loss: 0.0328 - val_mean_absolute_error: 0.1232
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0880 - mean_absolute_error: 0.1923 - val_loss: 0.0409 - val_mean_absolute_error: 0.1566
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0581 - mean_absolute_error: 0.1578 - val_loss: 0.0386 - val_mean_absolute_error: 0.1443
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0566 - mean_absolute_error: 0.1625 - val_loss: 0.0509 - val_mean_absolute_error: 0.1713
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0634 - mean_absolute_error: 0.1732 - val_loss: 0.0346 - val_mean_absolute_error: 0.1315
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0595 - mean_absolute_error: 0.1569 - val_loss: 0.0429 - val_mean_absolute_error: 0.1558
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0457 - mean_absolute_error: 0.1480 - val_loss: 0.0667 - val_mean_absolute_error: 0.2168
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0723 - mean_absolute_error: 0.1884 - val_loss: 0.0595 - val_mean_absolute_error: 0.2025
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 58ms/step - loss: 0.0434 - mean_absolute_error: 0.1431 - val_loss: 0.0295 - val_mean_absolute_error: 0.1185
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0486 - mean_absolute_error: 0.1563 - val_loss: 0.0367 - val_mean_absolute_error: 0.1481
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0445 - mean_absolute_error: 0.1356 - val_loss: 0.0442 - val_mean_absolute_error: 0.1681
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0885 - mean_absolute_error: 0.1509 - val_loss: 0.0387 - val_mean_absolute_error: 0.1517
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0545 - mean_absolute_error: 0.1723 - val_loss: 0.0419 - val_mean_absolute_error: 0.1617
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0867 - mean_absolute_error: 0.1738 - val_loss: 0.0440 - val_mean_absolute_error: 0.1587
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0752 - mean_absolute_error: 0.1767 - val_loss: 0.0256 - val_mean_absolute_error: 0.1082
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0417 - mean_absolute_error: 0.1317 - val_loss: 0.0214 - val_mean_absolute_error: 0.1023
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0564 - mean_absolute_error: 0.1492 - val_loss: 0.0358 - val_mean_absolute_error: 0.1376
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0569 - mean_absolute_error: 0.1595 - val_loss: 0.0539 - val_mean_absolute_error: 0.1816
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0541 - mean_absolute_error: 0.1420 - val_loss: 0.0774 - val_mean_absolute_error: 0.2076
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0772 - mean_absolute_error: 0.2085 - val_loss: 0.0544 - val_mean_absolute_error: 0.1870
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0533 - mean_absolute_error: 0.1572 - val_loss: 0.0596 - val_mean_absolute_error: 0.1952
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0585 - mean_absolute_error: 0.1750 - val_loss: 0.0277 - val_mean_absolute_error: 0.1160
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0494 - mean_absolute_error: 0.1522 - val_loss: 0.0672 - val_mean_absolute_error: 0.2132
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0530 - mean_absolute_error: 0.1657 - val_loss: 0.0938 - val_mean_absolute_error: 0.2536
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0552 - mean_absolute_error: 0.1658 - val_loss: 0.0374 - val_mean_absolute_error: 0.1448
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0622 - mean_absolute_error: 0.1552 - val_loss: 0.0229 - val_mean_absolute_error: 0.1085
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0467 - mean_absolute_error: 0.1449 - val_loss: 0.0424 - val_mean_absolute_error: 0.1633
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0440 - mean_absolute_error: 0.1419 - val_loss: 0.0649 - val_mean_absolute_error: 0.2092
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0561 - mean_absolute_error: 0.1488 - val_loss: 0.0444 - val_mean_absolute_error: 0.1585
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0632 - mean_absolute_error: 0.1594 - val_loss: 0.0401 - val_mean_absolute_error: 0.1502
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0416 - mean_absolute_error: 0.1436 - val_loss: 0.0769 - val_mean_absolute_error: 0.2309
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 0.0521 - mean_absolute_error: 0.1461 - val_loss: 0.0354 - val_mean_absolute_error: 0.1380
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0581 - mean_absolute_error: 0.1660 - val_loss: 0.0380 - val_mean_absolute_error: 0.1458
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0379 - mean_absolute_error: 0.1353 - val_loss: 0.0785 - val_mean_absolute_error: 0.2359
Validation losses: [0.19283302128314972, 0.12924645841121674, 0.070262610912323, 0.06846288591623306, 0.0784914493560791]
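Each "Avg. across folds" score in the summary below is simply the mean of the per-fold validation losses. As a quick sanity check against the list printed above:

```python
import numpy as np

# Per-fold validation losses copied from the output above
val_losses = [
    0.19283302128314972,
    0.12924645841121674,
    0.070262610912323,
    0.06846288591623306,
    0.0784914493560791,
]

# The cross-validation score is the plain mean across folds
cv_score = float(np.mean(val_losses))
print(cv_score)  # ≈ 0.1079
```

The result matches one of the averaged MSE scores reported in the hyperparameter summary below to full precision.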
HPS: {'player_emb_dim': 32, 'dense_units': 16, 'dense_units_2': 112, 'learning_rate': 0.001, 'dropout_rate': 0.2} Avg. across folds score(MSE): 0.09413674771785736
HPS: {'player_emb_dim': 16, 'dense_units': 96, 'dense_units_2': 32, 'learning_rate': 0.01, 'dropout_rate': 0.2} Avg. across folds score(MSE): 0.07665762156248093
HPS: {'player_emb_dim': 16, 'dense_units': 96, 'dense_units_2': 80, 'learning_rate': 0.01, 'dropout_rate': 0.1} Avg. across folds score(MSE): 0.07741621658205985
HPS: {'player_emb_dim': 8, 'dense_units': 112, 'dense_units_2': 96, 'learning_rate': 0.01, 'dropout_rate': 0.4} Avg. across folds score(MSE): 0.11440064609050751
HPS: {'player_emb_dim': 8, 'dense_units': 64, 'dense_units_2': 80, 'learning_rate': 0.01, 'dropout_rate': 0.30000000000000004} Avg. across folds score(MSE): 0.10785928517580032
HPS: {'player_emb_dim': 8, 'dense_units': 112, 'dense_units_2': 96, 'learning_rate': 0.01, 'dropout_rate': 0.4}. Avg MSE: 0.11440064609050751.
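The loop that produced these scores can be sketched as follows. This is a minimal, framework-agnostic illustration, not the chapter's actual search code: `train_and_eval` is a hypothetical stand-in for building a Keras model from one hyperparameter set, fitting it on the training folds, and returning its validation loss.

```python
import numpy as np
from sklearn.model_selection import KFold

def cv_score(train_and_eval, X, y, hps, n_splits=5, seed=42):
    """Average validation loss of one hyperparameter set across K folds.

    `train_and_eval(hps, X_tr, y_tr, X_val, y_val)` is a stand-in callable
    that trains a model with the given hyperparameters and returns its
    validation loss; the chapter's real training helpers may differ.
    """
    losses = []
    for tr, va in KFold(n_splits=n_splits, shuffle=True, random_state=seed).split(X):
        # Train on K-1 folds, score on the held-out fold
        losses.append(train_and_eval(hps, X[tr], y[tr], X[va], y[va]))
    # Configurations are then ranked by this mean across folds
    return float(np.mean(losses))
```

Each candidate hyperparameter dictionary gets one such score, and the configuration with the lowest mean MSE is retrained on all of the data, which is what the final 50-epoch run below does.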
Epoch 1/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 1s 6ms/step - loss: 1.2772 - mean_absolute_error: 0.8964
Epoch 2/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 2.5427 - mean_absolute_error: 1.3017 
Epoch 3/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 1.0353 - mean_absolute_error: 0.7746 
Epoch 4/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.8112 - mean_absolute_error: 0.7004 
Epoch 5/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.5706 - mean_absolute_error: 0.6033 
Epoch 6/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.6755 - mean_absolute_error: 0.6733 
Epoch 7/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.4798 - mean_absolute_error: 0.5331 
Epoch 8/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.4053 - mean_absolute_error: 0.4897 
Epoch 9/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2731 - mean_absolute_error: 0.4097 
Epoch 10/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3992 - mean_absolute_error: 0.4892 
Epoch 11/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2421 - mean_absolute_error: 0.3522 
Epoch 12/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3017 - mean_absolute_error: 0.4345 
Epoch 13/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2699 - mean_absolute_error: 0.4036 
Epoch 14/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2780 - mean_absolute_error: 0.3780 
Epoch 15/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2788 - mean_absolute_error: 0.3908 
Epoch 16/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1723 - mean_absolute_error: 0.3054 
Epoch 17/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2004 - mean_absolute_error: 0.3012 
Epoch 18/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1766 - mean_absolute_error: 0.2871 
Epoch 19/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1590 - mean_absolute_error: 0.2599 
Epoch 20/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1680 - mean_absolute_error: 0.2976 
Epoch 21/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1701 - mean_absolute_error: 0.2841 
Epoch 22/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1498 - mean_absolute_error: 0.2819 
Epoch 23/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2268 - mean_absolute_error: 0.3398 
Epoch 24/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.1710 - mean_absolute_error: 0.2960 
Epoch 25/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2077 - mean_absolute_error: 0.3199 
Epoch 26/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2154 - mean_absolute_error: 0.3423 
Epoch 27/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2695 - mean_absolute_error: 0.3620 
Epoch 28/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2361 - mean_absolute_error: 0.3680 
Epoch 29/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2092 - mean_absolute_error: 0.3190 
Epoch 30/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2107 - mean_absolute_error: 0.3374 
Epoch 31/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2474 - mean_absolute_error: 0.3525 
Epoch 32/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2351 - mean_absolute_error: 0.3641 
Epoch 33/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1848 - mean_absolute_error: 0.3176 
Epoch 34/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1969 - mean_absolute_error: 0.3247 
Epoch 35/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1606 - mean_absolute_error: 0.2854 
Epoch 36/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1663 - mean_absolute_error: 0.2958 
Epoch 37/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1797 - mean_absolute_error: 0.2839 
Epoch 38/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1624 - mean_absolute_error: 0.2827 
Epoch 39/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1771 - mean_absolute_error: 0.3004 
Epoch 40/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1932 - mean_absolute_error: 0.3303 
Epoch 41/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1611 - mean_absolute_error: 0.2872 
Epoch 42/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1743 - mean_absolute_error: 0.2873 
Epoch 43/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1942 - mean_absolute_error: 0.2906 
Epoch 44/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1384 - mean_absolute_error: 0.2701 
Epoch 45/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1570 - mean_absolute_error: 0.2925 
Epoch 46/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1628 - mean_absolute_error: 0.3025 
Epoch 47/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1681 - mean_absolute_error: 0.2979 
Epoch 48/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0917 - mean_absolute_error: 0.2051 
Epoch 49/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1249 - mean_absolute_error: 0.2441 
Epoch 50/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1723 - mean_absolute_error: 0.2842 

Plotting results#

def analyze_players_embeddings(model, player_strengths, marker_labels=None, random_state=None):
    """Extract the learned player embeddings and relate them to base strengths."""
    player_embedding_layer = model.get_layer("player_embedding")
    embeddings = player_embedding_layer.get_weights()[0]

    # Drop row 0, which is the padding/placeholder index, so only real players remain
    embeddings = embeddings[1:, :]
    player_strengths = player_strengths[1:]

    fig, embeddings_3d = umap_and_visualize(embeddings, player_strengths, marker_labels, random_state=random_state)

    # Correlate each dimension with the base strengths, both in the full
    # embedding space and in the 3-D UMAP projection (the latter is called
    # for its printed report; its return value is not used further)
    avg_corr = compute_correlations_for_projected_dims(embeddings, player_strengths)
    avg_3d_corr = compute_correlations_for_projected_dims(embeddings_3d, player_strengths)

    return fig, embeddings_3d, avg_corr
fig, _, _ = analyze_players_embeddings(model_hp, player_strengths)
fig
player_strengths.shape: (30,)
embeddings_nd[:, 0].shape : (30,)
Embeddings shape: (30, 32)
Dimension 1 correlation with base strengths: r = 0.1785, p-value = 0.3453
Dimension 2 correlation with base strengths: r = -0.0050, p-value = 0.9792
Dimension 3 correlation with base strengths: r = -0.1024, p-value = 0.5901
Dimension 4 correlation with base strengths: r = 0.2244, p-value = 0.2331
Dimension 5 correlation with base strengths: r = -0.5805, p-value = 0.000771
Dimension 6 correlation with base strengths: r = 0.0944, p-value = 0.6196
Dimension 7 correlation with base strengths: r = 0.4194, p-value = 0.02105
Dimension 8 correlation with base strengths: r = -0.4953, p-value = 0.005388
Dimension 9 correlation with base strengths: r = 0.3125, p-value = 0.09271
Dimension 10 correlation with base strengths: r = -0.7211, p-value = 6.955e-06
Dimension 11 correlation with base strengths: r = -0.5422, p-value = 0.001969
Dimension 12 correlation with base strengths: r = 0.5153, p-value = 0.003565
Dimension 13 correlation with base strengths: r = 0.4203, p-value = 0.02074
Dimension 14 correlation with base strengths: r = 0.2815, p-value = 0.1318
Dimension 15 correlation with base strengths: r = 0.6655, p-value = 5.995e-05
Dimension 16 correlation with base strengths: r = -0.1608, p-value = 0.3961
Dimension 17 correlation with base strengths: r = 0.2720, p-value = 0.146
Dimension 18 correlation with base strengths: r = 0.1385, p-value = 0.4653
Dimension 19 correlation with base strengths: r = -0.3629, p-value = 0.0487
Dimension 20 correlation with base strengths: r = 0.4603, p-value = 0.01048
Dimension 21 correlation with base strengths: r = -0.6484, p-value = 0.0001066
Dimension 22 correlation with base strengths: r = -0.5130, p-value = 0.003743
Dimension 23 correlation with base strengths: r = 0.1228, p-value = 0.5179
Dimension 24 correlation with base strengths: r = -0.0404, p-value = 0.832
Dimension 25 correlation with base strengths: r = -0.2752, p-value = 0.1411
Dimension 26 correlation with base strengths: r = 0.0748, p-value = 0.6943
Dimension 27 correlation with base strengths: r = 0.3721, p-value = 0.04287
Dimension 28 correlation with base strengths: r = 0.4475, p-value = 0.01314
Dimension 29 correlation with base strengths: r = -0.3536, p-value = 0.05526
Dimension 30 correlation with base strengths: r = -0.3668, p-value = 0.0462
Dimension 31 correlation with base strengths: r = -0.4439, p-value = 0.01399
Dimension 32 correlation with base strengths: r = 0.6026, p-value = 0.0004249
Average absolute correlation across 32 components: 0.3504
player_strengths.shape: (30,)
embeddings_nd[:, 0].shape : (30,)
Embeddings shape: (30, 3)
Dimension 1 correlation with base strengths: r = -0.8946, p-value = 2.656e-11
Dimension 2 correlation with base strengths: r = 0.2358, p-value = 0.2096
Dimension 3 correlation with base strengths: r = -0.7479, p-value = 2.024e-06
Average absolute correlation across 3 components: 0.6261
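For reference, lines like the ones above are plain Pearson correlations between a single embedding dimension and the players' base strengths. A self-contained sketch with random stand-in data (the real `player_strengths` and `embeddings_nd` come from earlier cells):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
player_strengths = rng.normal(size=30)    # stand-in for the real base strengths
embeddings_nd = rng.normal(size=(30, 3))  # stand-in for the learned 3-D embeddings

rs = []
for d in range(embeddings_nd.shape[1]):
    r, p = pearsonr(embeddings_nd[:, d], player_strengths)
    rs.append(abs(r))
    print(f"Dimension {d + 1} correlation with base strengths: r = {r:.4f}, p-value = {p:.4g}")
print(f"Average absolute correlation across {len(rs)} components: {np.mean(rs):.4f}")
```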

Adding interactions between players#

Pairwise interactions function#

def pairwise_interactions(embeddings, mask, pairwise_dense, debug=False):
    """
    embeddings: Tensor of shape (batch, seq_len, emb_dim)
    mask: Boolean tensor of shape (batch, seq_len) indicating valid tokens (not padded)
    pairwise_dense: Dense layer applied to each concatenated pair of embeddings

    Returns:
    pooled vector representing aggregated pairwise interactions, shape (batch, interaction_dim)
    """
    
    print("=" * 40 + " Starting pairwise interaction computation " + "=" * 50)
    
    batch_size = tf.shape(embeddings)[0]
    seq_len = tf.shape(embeddings)[1]
    emb_dim = tf.shape(embeddings)[2]
    

    # Create indices (i, j) of all pairs i < j, generated dynamically
    # for the current sequence length:
    def batch_pair_indices(batch_size, seq_len):
        """
        Generate pair indices (i, j) for each element in batch,
        where i < j and i, j in [0, seq_len).

        Args:
        batch_size: integer, batch size
        seq_len: integer, length of the sequence dimension (same for all batch)

        Returns:
        i_indices: int32 tensor of shape (batch_size, num_pairs)
        j_indices: int32 tensor of shape (batch_size, num_pairs)

        where num_pairs = seq_len * (seq_len - 1) // 2
        """
        # Create 1D range for sequence length
        idx = tf.range(seq_len)
        print("idx shape:", idx.shape)

        # Create meshgrid i and j indices over seq_len
        i, j = tf.meshgrid(idx, idx)  # shape (seq_len, seq_len)
        print("i shape:", i.shape)

        # Mask upper triangular matrix indices where i < j
        upper_tri_mask = tf.linalg.band_part(tf.ones((seq_len, seq_len), dtype=tf.bool), 0, -1) & \
                        ~tf.linalg.band_part(tf.ones((seq_len, seq_len), dtype=tf.bool), 0, 0)
                                      
        i_pairs = tf.boolean_mask(i, upper_tri_mask)  # shape (num_pairs,)
        j_pairs = tf.boolean_mask(j, upper_tri_mask)  # shape (num_pairs,)
        
        # Tile this to batch dimension
        i_batch = tf.tile(tf.expand_dims(i_pairs, 0), [batch_size, 1])  # shape (batch_size, num_pairs)
        j_batch = tf.tile(tf.expand_dims(j_pairs, 0), [batch_size, 1])  # shape (batch_size, num_pairs)

        return i_batch, j_batch
    
    
    i_indices, j_indices = batch_pair_indices(batch_size, seq_len)

    if debug:
        print("batch_size shape:", batch_size.shape)
        print("i_indices shape:", i_indices.shape)

    # Gather pairs embeddings for each (i,j) index
    # embeddings shape: (batch, seq_len, emb_dim)
    emb_i = tf.gather(embeddings, i_indices, axis=1, batch_dims=1)  # (batch, num_pairs, emb_dim)
    emb_j = tf.gather(embeddings, j_indices, axis=1, batch_dims=1)  # (batch, num_pairs, emb_dim)

    # Gather mask values for i and j positions
    mask_i = tf.gather(mask, i_indices, axis=1, batch_dims=1)  # (batch, num_pairs)
    mask_j = tf.gather(mask, j_indices, axis=1, batch_dims=1)  # (batch, num_pairs)

    # Only keep pairs where both positions are valid (not padded)
    valid_pairs_mask = tf.logical_and(mask_i, mask_j)  # (batch, num_pairs)

    # Concatenate pairwise embeddings
    pair_emb = tf.concat([emb_i, emb_j], axis=-1)  # (batch, num_pairs, 2*emb_dim)
    
    # Apply dense layer to pairs
    interaction = pairwise_dense(pair_emb)  # (batch, num_pairs, n_dense_units)
    
    # Mask out invalid pairs by setting to zero
    valid_pairs_mask_expanded = tf.cast(tf.expand_dims(valid_pairs_mask, axis=-1), dtype=tf.float32)
    if debug: 
        print("pairwise:valid_pairs_mask_expanded", valid_pairs_mask_expanded.shape)  
    
    interaction_masked = interaction * valid_pairs_mask_expanded  # zero out invalid pairs
    if debug: 
        print("pairwise:interaction_masked", interaction_masked.shape)  

    # Reduce carefully: padded pairs are already zeroed in the numerator,
    # and they must also be excluded from the denominator.
    # Sum over valid pairs per batch element
    valid_counts = tf.reduce_sum(valid_pairs_mask_expanded, axis=1)  # (batch, 1)
    
    pooled = tf.reduce_sum(interaction_masked, axis=1) / (valid_counts + 1e-8)  # avoid div-by-zero, (batch, interaction_dim)
    
    # Wrong version of pooling:
    # pooled = tf.reduce_mean(interaction_masked, axis=1)  # shape: (batch, interaction_dim)
    # This averages over ALL pairs on axis 1. Invalid pairs are already zeroed out
    # (we multiplied by valid_pairs_mask_expanded), but those zeros still count in the
    # divisor, so batch elements with many padded pairs get an artificially low average.
    print("pairwise:pooled", pooled.shape)  
 
    # Now we return shape (batch, interaction_dim), having reduced along num_pairs (combinations of players).
    # Why can't we return (batch, num_pairs) here by reducing within the embeddings of each pair?
    # The reason appears to be that the dense_1 layer "instantiates"/remembers the shape from the first
    # combination of hyperparameters ( --> interaction_dim <--, n_dense_units) and complains on the second iteration.
    # At runtime we are actually concatenating very different shapes (None, 32)+(None, 32)+(None, 64) and that is fine.
    # Why we can't concatenate X dimensions (None, X)+(None, 32)+(None, 64) I don't yet understand. Maybe it has
    # something to do with how the Keras tuner caches layers' shapes? It shouldn't, as its purpose is to search over
    # different parameters and hence different architectures and shapes.
    
    return pooled 
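The reasoning in the comments above can be checked with a tiny NumPy sketch (the values are made up): dividing by the count of valid pairs recovers the true average, while a naive mean over all rows counts the padded zeros in the denominator.

```python
import numpy as np

# Hypothetical per-pair interaction outputs for one batch element:
# 3 valid pairs and 3 padded ones (already zeroed out by the mask).
interaction_masked = np.array([[2.0], [4.0], [0.0], [6.0], [0.0], [0.0]])
valid = np.array([[1.0], [1.0], [0.0], [1.0], [0.0], [0.0]])  # pair validity mask

naive_mean = interaction_masked.sum() / len(interaction_masked)  # 12 / 6 = 2.0
masked_mean = interaction_masked.sum() / (valid.sum() + 1e-8)    # 12 / 3 = 4.0
```

The more padded pairs a batch element has, the further the naive mean drifts below the masked one.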

Debugging pairwise interactions#

  
from itertools import combinations
    
def test_pairwise():
    batch_size = 2
    seq_len = 4
    emb_dim = 3
    interaction_dim = 5

    # Create deterministic embeddings so you can inspect output
    embeddings = tf.constant([
        [  # Batch 0
            [1, 2, 3],      # Player 0
            [4, 5, 6],      # Player 1
            [7, 8, 9],      # Player 2
            [0, 0, 0],   # Player 3
        ],
        [  # Batch 1
            [13, 14, 15],   # Player 0
            [16, 17, 18],   # Player 1
            [0, 0, 0],   # Player 2
            [0, 0, 0],   # Player 3
        ]
    ], dtype=tf.float32) 
  

    # For seq_len = 4 the full list of (i, j) pairs is:
    #   [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
    # In batch 0, player 3 is padded, so only these pairs stay valid:
    #   [(0, 1), (0, 2), (1, 2)]
    # The same enumeration can be produced with itertools.combinations:
    #   combos = list(combinations(range(4), 2))
    
    mask = tf.constant([
        [True, True, True, False],  # Batch 0
        [True, True, False, False]    # Batch 1
    ], dtype=tf.bool)
    
    pairwise_dense = layers.Dense(interaction_dim, activation='relu', name='pairwise_dense')
    
    result = pairwise_interactions(embeddings, mask, pairwise_dense)
    return result
res = test_pairwise()
res
======================================== Starting pairwise interaction computation ==================================================
idx shape: (4,)
i shape: (4, 4)
pairwise:pooled (2, 5)
<tf.Tensor: shape=(2, 5), dtype=float32, numpy=
array([[ 0.      ,  4.794046,  0.      ,  9.443652,  0.      ],
       [ 0.      , 10.371949,  0.      , 24.933601,  0.      ]],
      dtype=float32)>

<tf.Tensor: shape=(2, 6, 6), dtype=float32, numpy=
array([[[ 4.,  5.,  6.,  1.,  2.,  3.],
        [ 7.,  8.,  9.,  1.,  2.,  3.],
        [ 0.,  0.,  0.,  0.,  0.,  0.],
        [ 7.,  8.,  9.,  4.,  5.,  6.],
        [ 0.,  0.,  0.,  0.,  0.,  0.],
        [ 0.,  0.,  0.,  0.,  0.,  0.]],

       [[16., 17., 18., 13., 14., 15.],
        [ 0.,  0.,  0.,  0.,  0.,  0.],
        [ 0.,  0.,  0.,  0.,  0.,  0.],
        [ 0.,  0.,  0.,  0.,  0.,  0.],
        [ 0.,  0.,  0.,  0.,  0.,  0.],
        [ 0.,  0.,  0.,  0.,  0.,  0.]]], dtype=float32)>
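The pair enumeration inside `batch_pair_indices` can be reproduced with NumPy alone (a small check, independent of TensorFlow): with the default 'xy' meshgrid indexing the gathered pairs come out as (i, j) with i > j — matching the tensor above, where each row holds the later player's embedding followed by the earlier one's — but every unordered pair appears exactly once.

```python
import numpy as np
from itertools import combinations

seq_len = 4
idx = np.arange(seq_len)
i, j = np.meshgrid(idx, idx)  # default 'xy' indexing, same as tf.meshgrid

# Strict upper triangle, equivalent to the two band_part calls in the function
upper = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
i_pairs, j_pairs = i[upper].tolist(), j[upper].tolist()

# Every unordered pair {i, j} shows up exactly once
pairs = sorted((min(a, b), max(a, b)) for a, b in zip(i_pairs, j_pairs))
assert pairs == sorted(combinations(range(seq_len), 2))
assert len(pairs) == seq_len * (seq_len - 1) // 2  # num_pairs = 6
```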

Build interaction model#

def build_model_inter(hp):
    
    attention_heads = hp.Int('attention_heads', min_value=1, max_value=4, step=1)
    attention_dropout_rate = hp.Float('attention_dropout_rate', 0.1, 0.4, step=0.1)
    
    player_emb_dim = hp.Choice('player_emb_dim', [32])  # narrowed from [8, 16, 32]
    dense_units = hp.Int('dense_units', min_value=16, max_value=128, step=16)
    dense_units_2 = hp.Int('dense_units_2', min_value=16, max_value=128, step=16)
    learning_rate = hp.Choice('learning_rate', [1e-2, 1e-3, 1e-4])
    dropout_rate = hp.Float('dropout_rate', 0.1, 0.4, step=0.1)
    dropout_rate_2 = hp.Float('dropout_rate_2', 0.1, 0.4, step=0.1)
    dropout_rate_inter = hp.Float('dropout_rate_inter', 0.1, 0.4, step=0.1)
    interaction_scale = hp.Int('interaction_scale', min_value=2, max_value=4, step=1)
    
    for name, value in hp.values.items():
        print(f"  {name}: {value}")
        
    # Pairwise interaction pooling dimension (used in two places)
    interaction_dim = player_emb_dim * interaction_scale
    
    # Inputs: variable-length teams
    teamA_input = Input(batch_shape=(None, 9, ), dtype='int32', name='teamA')  
    teamB_input = Input(batch_shape=(None, 9, ), dtype='int32', name='teamB')  
    # teamA_input = Input(shape=(None, ), dtype='int32', name='teamA')  
    # teamB_input = Input(shape=(None, ), dtype='int32', name='teamB')  
    
    
    # Embedding layer with mask support
    player_embedding = layers.Embedding(
        input_dim=NUM_PLAYERS + 1,  # includes 0 for mask
        output_dim=player_emb_dim,
        embeddings_initializer=initializers.GlorotUniform(seed=seed_value),
        mask_zero=True,  # Important: enables automatic masking for padding (0 as pad token)
        # embeddings_regularizer=tf.keras.regularizers.l2(1e-4),
        name='player_embedding'
    )
    
    # Embed team players
    teamA_embeds = player_embedding(teamA_input)  # shape: (batch, teamA_len, emb_dim)
    teamB_embeds = player_embedding(teamB_input)

    
    # Self-attention block (respects masks automatically if using Functional API)
    def self_attention_block(x, name_prefix=''):
        attn_output = layers.MultiHeadAttention(
            num_heads=attention_heads,
            key_dim=player_emb_dim,
            dropout=attention_dropout_rate,
            name=f'{name_prefix}_attn'
        )(x, x)
        x = layers.Add(name=f'{name_prefix}_residual')([x, attn_output])
        x = layers.LayerNormalization(name=f'{name_prefix}_norm')(x)
        return x
    
    # Apply attention
    teamA_attn = self_attention_block(teamA_embeds, 'teamA')
    teamB_attn = self_attention_block(teamB_embeds, 'teamB')
    
    print("teamA_embeds.shape", teamA_embeds.shape) 
    print("teamA_attn.shape", teamA_attn.shape) 
    
    
    # Global average pooling over valid (non-padded) tokens
    # TF handles masking automatically in GlobalAveragePooling1D if mask_zero=True
    teamA_vector = layers.GlobalAveragePooling1D(name='teamA_avgpool')(teamA_attn)
    teamB_vector = layers.GlobalAveragePooling1D(name='teamB_avgpool')(teamB_attn)
    print("teamA_vector_pooled.shape", teamA_vector.shape) 
    
    pairwise_dense = layers.Dense(interaction_dim, activation='relu', name='pairwise_dense')
    # Extract mask from embeddings (automatically created by Embedding with mask_zero=True)
    teamA_mask = teamA_embeds._keras_mask  # boolean tensor (batch, teamA_len)
    teamB_mask = teamB_embeds._keras_mask
    
    print("teamA_mask.shape", teamA_mask.shape) 
    
    teamA_pairwise = layers.Lambda(lambda x: pairwise_interactions(x[0], x[1], pairwise_dense), output_shape=(interaction_dim,), name='teamA_pairwise')([teamA_embeds, teamA_mask])
    teamA_pairwise = layers.Dropout(dropout_rate_inter)(teamA_pairwise)
    teamB_pairwise = layers.Lambda(lambda x: pairwise_interactions(x[0], x[1], pairwise_dense), output_shape=(interaction_dim,), name='teamB_pairwise')([teamB_embeds, teamB_mask])
    teamB_pairwise = layers.Dropout(dropout_rate_inter)(teamB_pairwise)
   

    # Combine attention pooled vector and pairwise interaction
    teamA_combined = layers.Concatenate(name='teamA_combined')([teamA_vector, teamA_pairwise])  # (batch, emb_dim + 64)
    teamB_combined = layers.Concatenate(name='teamB_combined')([teamB_vector, teamB_pairwise])
    
    # # Matchup modeling (difference vector)
    matchup_vector = layers.Subtract(name='matchup_diff')([teamA_combined, teamB_combined])
       
    print("teamA_pairwise.shape", teamA_pairwise.shape)    
    print("teamA_combined.shape", teamA_combined.shape)  
    print("teamB_combined.shape", teamB_combined.shape)  
    print("matchup_vector.shape", matchup_vector.shape)  
    
    # Concatenate summary representation
    match_input = layers.Concatenate(name='match_features')([teamA_combined, teamB_combined, matchup_vector])
    
    
    print("match_input.shape", match_input.shape)  

    x = layers.Dense(dense_units, activation='relu', kernel_regularizer=tf.keras.regularizers.l2(1e-4), name = "dense_1")(match_input)
    x = layers.Dropout(dropout_rate)(x)
    x = layers.Dense(dense_units_2, activation='relu', kernel_regularizer=tf.keras.regularizers.l2(1e-4), name = "dense_2")(x)
    x = layers.Dropout(dropout_rate_2)(x)
    
    outcome = layers.Dense(1, activation='linear', name='game_outcome')(x)
    print("Outcome")  

    # Final model
    model = Model(inputs=[teamA_input, teamB_input], outputs=[outcome])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate),
                    loss='mean_squared_error', 
                    metrics=['mean_absolute_error']
                 )

    # model.summary()
    return model
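The shapes printed during the search can be reproduced by hand. This small helper (not part of the model, just arithmetic) mirrors how `build_model_inter` derives its widths: `interaction_dim = player_emb_dim * interaction_scale`, each team's combined vector is the pooled embedding plus the pairwise pooling, and `match_features` concatenates both teams and their difference.

```python
def match_feature_dim(player_emb_dim, interaction_scale):
    """Width of the 'match_features' tensor produced by build_model_inter."""
    interaction_dim = player_emb_dim * interaction_scale  # pairwise_dense units
    combined = player_emb_dim + interaction_dim           # avg-pooled + pairwise
    return 3 * combined                                   # teamA, teamB, difference

print(match_feature_dim(32, 2))  # 288, as in the FOLD 1 printout
print(match_feature_dim(32, 3))  # 384
print(match_feature_dim(32, 4))  # 480
```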

Run interaction model#

from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)

# Define a learning rate schedule function (step decay example)
def lr_schedule(epoch, lr):
    drop_rate = 0.5
    epochs_drop = 10
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

#Instantiate callback
lr_scheduler = tf.keras.callbacks.LearningRateScheduler(lr_schedule)

# Or adaptive reduction on plateau (reduce LR when val_loss stalls)
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=5, min_lr=1e-6)

es_callbacks=[lr_scheduler, reduce_lr, early_stop]
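To see what the step decay actually does, here is a standalone trace of the same rule as `lr_schedule` (redefined locally so the snippet runs on its own), starting from a hypothetical initial rate of 0.01:

```python
def step_decay(epoch, lr, drop_rate=0.5, epochs_drop=10):
    # Same rule as lr_schedule: halve the rate every `epochs_drop` epochs
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

lr = 0.01
trace = []
for epoch in range(25):
    lr = step_decay(epoch, lr)
    trace.append(lr)

# The rate halves at epochs 10 and 20: 0.01 -> 0.005 -> 0.0025
```

Because `lr_schedule` receives the current rate and returns it unchanged between drops, any extra reductions made by `ReduceLROnPlateau` should persist across epochs when the two callbacks are combined.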


all_best_hps_inter = hyperparameter_search(build_model_inter, max_trials=10, callbacks=es_callbacks)
teamA_data shape: (100, 9)
teamB_data shape: (100, 9)
outcomes shape: (100,)

FOLD 1
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
/opt/anaconda3/envs/footballman/lib/python3.12/site-packages/keras/src/layers/layer.py:939: UserWarning:

Layer 'teamA_pairwise' (of type Lambda) was passed an input with a mask attached to it. However, this layer does not support masking and will therefore destroy the mask information. Downstream layers will not see the mask.

/opt/anaconda3/envs/footballman/lib/python3.12/site-packages/keras/src/layers/layer.py:939: UserWarning:

Layer 'teamB_pairwise' (of type Lambda) was passed an input with a mask attached to it. However, this layer does not support masking and will therefore destroy the mask information. Downstream layers will not see the mask.
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 112
  learning_rate: 0.001
  dropout_rate: 0.2
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.4
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
2025-08-09 17:04:25.169462: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
2025-08-09 17:04:25.169855: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 48
  learning_rate: 0.0001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 112
  learning_rate: 0.001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 48
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 64)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 64
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 64)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 32
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.4
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
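The shape trace for this trial is consistent with a simple relation between the hyperparameters and the tensor widths. This is an inference from the logged shapes, not taken from the model code, and the variable names below are illustrative:

```python
# Widths inferred from the trial above (player_emb_dim=32, interaction_scale=4).
# These names mirror the logged tensors but are illustrative, not the real code.
player_emb_dim = 32
interaction_scale = 4

pairwise_dim = player_emb_dim * interaction_scale  # teamA_pairwise: (None, 128)
combined_dim = player_emb_dim + pairwise_dim       # teamA_combined: (None, 160)
match_input_dim = 3 * combined_dim                 # match_input:    (None, 480)

print(pairwise_dim, combined_dim, match_input_dim)  # 128 160 480
```

The same relation holds for the other trials in this log: `interaction_scale=2` gives widths 64/96/288 and `interaction_scale=3` gives 96/128/384.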
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 32
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.1
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 128
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 16
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
/opt/anaconda3/envs/footballman/lib/python3.12/site-packages/keras/src/saving/saving_lib.py:757: UserWarning:

Skipping variable loading for optimizer 'adam', because it has 2 variables whereas the saved optimizer has 56 variables. 
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 398ms/step - loss: 0.2271 - mean_absolute_error: 0.3643
1: 0.2271081507205963     2: 0.36433491110801697  
0.2271081507205963
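The log above interleaves many such trials: for each cross-validation fold, a number of random hyperparameter configurations are built and scored, and the best one is kept. A minimal sketch of that pattern follows; `score_fn` is a hypothetical stand-in for the real model training, and only the search-space values are taken from the log:

```python
import random

# Hyperparameter space matching the values seen in the log above.
SEARCH_SPACE = {
    "player_emb_dim": [32],
    "dense_units": [16, 32, 48, 64, 80, 96, 112, 128],
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "dropout_rate": [0.1, 0.2, 0.3, 0.4],
    "interaction_scale": [2, 3, 4],
}

def sample_config(rng: random.Random) -> dict:
    """Draw one random configuration from the search space."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

def kfold_indices(n_samples: int, k: int):
    """Yield (train, val) index lists for k folds, as in the FOLD 1 / FOLD 2 sections."""
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    for i in range(k):
        val = folds[i]
        train = [j for f, fold in enumerate(folds) if f != i for j in fold]
        yield train, val

def search(n_samples: int, k: int, n_trials: int, score_fn, seed: int = 0):
    """For each fold, score n_trials random configs and keep the best (lowest loss)."""
    rng = random.Random(seed)
    best_per_fold = []
    for train, val in kfold_indices(n_samples, k):
        trials = [(score_fn(cfg, train, val), cfg)
                  for cfg in (sample_config(rng) for _ in range(n_trials))]
        best_per_fold.append(min(trials, key=lambda t: t[0]))
    return best_per_fold
```

With roughly 20 recorded matches, per-fold evaluation like this is about the only way to get a usable validation signal, at the cost of retraining the model once per configuration per fold.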

FOLD 2
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 16
  learning_rate: 0.001
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 80
  learning_rate: 0.001
  dropout_rate: 0.1
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 96
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 64
  learning_rate: 0.0001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.4
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 48
  learning_rate: 0.0001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 128
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.2
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 64
  learning_rate: 0.0001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.1
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 96
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.2
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 32
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.2
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 112
  learning_rate: 0.001
  dropout_rate: 0.4
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 80
  learning_rate: 0.001
  dropout_rate: 0.1
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 395ms/step - loss: 0.2381 - mean_absolute_error: 0.3499
1: 0.2380724847316742     2: 0.34990328550338745  
0.2380724847316742

FOLD 3
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 112
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.4
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 112
  learning_rate: 0.01
  dropout_rate: 0.2
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.4
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 32
  learning_rate: 0.001
  dropout_rate: 0.4
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.2
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 96
  learning_rate: 0.01
  dropout_rate: 0.2
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.2
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 96
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.4
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 80
  learning_rate: 0.01
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 48
  learning_rate: 0.01
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 32
  learning_rate: 0.01
  dropout_rate: 0.2
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 32
  learning_rate: 0.0001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.4
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 48
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
2025-08-09 17:07:03.558653: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 48
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 390ms/step - loss: 0.1290 - mean_absolute_error: 0.2664
1: 0.12901265919208527     2: 0.2663547992706299  
0.12901265919208527

FOLD 4
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 128
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 64)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 16
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 96
  dense_units_2: 96
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 16
  learning_rate: 0.001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 112
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.4
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 48
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 64)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 96
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.4
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 128
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.1
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 96
  dense_units_2: 16
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.4
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 64)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 128
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 408ms/step - loss: 0.1265 - mean_absolute_error: 0.2443
1: 0.12653082609176636     2: 0.24432477355003357  
0.12653082609176636

FOLD 5
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 112
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.2
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 128
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 32
  learning_rate: 0.001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 64
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.4
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 64)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 48
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.4
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 96
  learning_rate: 0.001
  dropout_rate: 0.4
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 64)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 80
  learning_rate: 0.001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 128
  learning_rate: 0.001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.2
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 128)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 128)
  player_emb_dim: 32
  dense_units: 96
  dense_units_2: 32
  learning_rate: 0.001
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 64)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 64)
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 112
  learning_rate: 0.001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (16, 96)
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 112
  learning_rate: 0.001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
pairwise:pooled (None, 96)
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 401ms/step - loss: 0.2170 - mean_absolute_error: 0.3427
1: 0.21696588397026062     2: 0.3426567018032074  
0.21696588397026062

Best hyperparameters found:
player_emb_dim: 32
dense_units: 64
dense_units_2: 80
learning_rate: 0.001
dropout_rate: 0.1
dropout_rate_2: 0.30000000000000004
dropout_rate_inter: 0.1
interaction_scale: 4
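The printout above is the winning trial of the random search. Under the hood, picking the winner is nothing more than taking the minimum over the recorded per-trial validation MSEs. The sketch below illustrates this with hypothetical trial records; the `trials` list and `select_best_hps` helper are illustrative stand-ins, not part of `competition_manager`:

```python
# Hypothetical trial records mimicking the per-trial printouts above:
# (hyperparameters, validation MSE) pairs. The values are illustrative.
trials = [
    ({"dense_units": 16, "learning_rate": 1e-4}, 0.2931),
    ({"dense_units": 64, "learning_rate": 1e-3}, 0.2170),
    ({"dense_units": 48, "learning_rate": 1e-2}, 0.2271),
]

def select_best_hps(trials):
    """Return the (hyperparameters, MSE) pair with the lowest validation MSE."""
    return min(trials, key=lambda t: t[1])

best_hps, best_mse = select_best_hps(trials)
```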

Selecting best model#

model_inter, model_inter_train_loss = train_best_hps_model(all_best_hps_inter)
HPS: {'player_emb_dim': 32, 'dense_units': 48, 'dense_units_2': 16, 'learning_rate': 0.01, 'dropout_rate': 0.1, 'dropout_rate_2': 0.1, 'dropout_rate_inter': 0.2, 'interaction_scale': 2}. MSE during RandomSearch: 0.2271081507205963. Starting evaluation across all k folds...
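The `train_best_hps_model` helper (imported from `competition_manager` and not shown here) then re-evaluates the chosen hyperparameters across all k folds, as the fold-by-fold log records. Its evaluation loop boils down to retrain-and-score per fold, then average. A minimal stand-in sketch of that loop, using a mean-predictor baseline in place of the real Keras model:

```python
import numpy as np

def kfold_mse(y, k=5, seed=0):
    """Score a baseline mean-predictor across k folds and return the
    per-fold MSEs plus their average. A stand-in for the retrain-and-score
    loop that train_best_hps_model runs with the actual model."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    mses = []
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        pred = y[train].mean()  # "fit" the baseline on the training folds
        mses.append(float(np.mean((y[val] - pred) ** 2)))  # score held-out fold
    return mses, float(np.mean(mses))
```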

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 96ms/step - loss: 0.9936 - mean_absolute_error: 0.7787 - val_loss: 1.5553 - val_mean_absolute_error: 1.0145
Epoch 2/50
1/3 ━━━━━━━━━━━━━━━━━━━━ 0s 12ms/step - loss: 0.7324 - mean_absolute_error: 0.6830
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.7426 - mean_absolute_error: 0.6821 - val_loss: 0.8927 - val_mean_absolute_error: 0.7322
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.6204 - mean_absolute_error: 0.6416 - val_loss: 0.6289 - val_mean_absolute_error: 0.6757
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.5110 - mean_absolute_error: 0.5757 - val_loss: 0.5165 - val_mean_absolute_error: 0.5638
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.4302 - mean_absolute_error: 0.5351 - val_loss: 0.3774 - val_mean_absolute_error: 0.5039
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3126 - mean_absolute_error: 0.4373 - val_loss: 0.3330 - val_mean_absolute_error: 0.4475
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2103 - mean_absolute_error: 0.3425 - val_loss: 0.3369 - val_mean_absolute_error: 0.4452
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2235 - mean_absolute_error: 0.3659 - val_loss: 0.2527 - val_mean_absolute_error: 0.4015
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1939 - mean_absolute_error: 0.3326 - val_loss: 0.2440 - val_mean_absolute_error: 0.3906
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1615 - mean_absolute_error: 0.2765 - val_loss: 0.2149 - val_mean_absolute_error: 0.3624
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1448 - mean_absolute_error: 0.2794 - val_loss: 0.2024 - val_mean_absolute_error: 0.3429
... [epochs 12-49 omitted for brevity] ...
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0951 - mean_absolute_error: 0.2222 - val_loss: 0.1780 - val_mean_absolute_error: 0.3366

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0783 - mean_absolute_error: 0.1946 - val_loss: 0.0812 - val_mean_absolute_error: 0.2209
... [epochs 2-49 omitted for brevity] ...
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0562 - mean_absolute_error: 0.1587 - val_loss: 0.1080 - val_mean_absolute_error: 0.2117

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0695 - mean_absolute_error: 0.1862 - val_loss: 0.0421 - val_mean_absolute_error: 0.1331
... [epochs 2-49 omitted for brevity] ...
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0757 - mean_absolute_error: 0.1842 - val_loss: 0.0673 - val_mean_absolute_error: 0.1976

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0880 - mean_absolute_error: 0.1889 - val_loss: 0.0138 - val_mean_absolute_error: 0.0637
... [epochs 2-49 omitted for brevity] ...
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0920 - mean_absolute_error: 0.1825 - val_loss: 0.0359 - val_mean_absolute_error: 0.1350

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0445 - mean_absolute_error: 0.1462 - val_loss: 0.0139 - val_mean_absolute_error: 0.0738
... [epochs 2-44 omitted for brevity] ...
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0458 - mean_absolute_error: 0.1400 - val_loss: 0.0295 - val_mean_absolute_error: 0.1321
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0342 - mean_absolute_error: 0.1258 - val_loss: 0.0285 - val_mean_absolute_error: 0.1296
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0339 - mean_absolute_error: 0.1258 - val_loss: 0.0277 - val_mean_absolute_error: 0.1308
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0510 - mean_absolute_error: 0.1423 - val_loss: 0.0314 - val_mean_absolute_error: 0.1290
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0285 - mean_absolute_error: 0.1142 - val_loss: 0.0382 - val_mean_absolute_error: 0.1491
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0469 - mean_absolute_error: 0.1256 - val_loss: 0.0433 - val_mean_absolute_error: 0.1592
Validation losses: [0.17804540693759918, 0.10796533524990082, 0.06733987480401993, 0.03587379679083824, 0.0432770811021328]
HPS: {'player_emb_dim': 32, 'dense_units': 64, 'dense_units_2': 80, 'learning_rate': 0.001, 'dropout_rate': 0.1, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.1, 'interaction_scale': 4}. MSE during RandomSearch: 0.2380724847316742. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 98ms/step - loss: 1.2993 - mean_absolute_error: 0.8944 - val_loss: 1.1134 - val_mean_absolute_error: 0.8304
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.9415 - mean_absolute_error: 0.7526 - val_loss: 1.1433 - val_mean_absolute_error: 0.8550
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.8708 - mean_absolute_error: 0.7532 - val_loss: 1.0890 - val_mean_absolute_error: 0.8279
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.6392 - mean_absolute_error: 0.6324 - val_loss: 1.0195 - val_mean_absolute_error: 0.7853
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.5420 - mean_absolute_error: 0.5910 - val_loss: 0.9674 - val_mean_absolute_error: 0.7551
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.4623 - mean_absolute_error: 0.5552 - val_loss: 0.8912 - val_mean_absolute_error: 0.6998
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.4521 - mean_absolute_error: 0.5302 - val_loss: 0.7241 - val_mean_absolute_error: 0.6154
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3099 - mean_absolute_error: 0.4425 - val_loss: 0.6436 - val_mean_absolute_error: 0.5843
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.2777 - mean_absolute_error: 0.4035 - val_loss: 0.6148 - val_mean_absolute_error: 0.5736
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2194 - mean_absolute_error: 0.3639 - val_loss: 0.5518 - val_mean_absolute_error: 0.5479
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1884 - mean_absolute_error: 0.3216 - val_loss: 0.4473 - val_mean_absolute_error: 0.4969
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2098 - mean_absolute_error: 0.3528 - val_loss: 0.4352 - val_mean_absolute_error: 0.4931
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1500 - mean_absolute_error: 0.2965 - val_loss: 0.4155 - val_mean_absolute_error: 0.4888
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1635 - mean_absolute_error: 0.3182 - val_loss: 0.4168 - val_mean_absolute_error: 0.4909
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1125 - mean_absolute_error: 0.2579 - val_loss: 0.4529 - val_mean_absolute_error: 0.5086
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1293 - mean_absolute_error: 0.2627 - val_loss: 0.4997 - val_mean_absolute_error: 0.5294
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1319 - mean_absolute_error: 0.2740 - val_loss: 0.4448 - val_mean_absolute_error: 0.4949
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1103 - mean_absolute_error: 0.2396 - val_loss: 0.4162 - val_mean_absolute_error: 0.4944
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0870 - mean_absolute_error: 0.2120 - val_loss: 0.4319 - val_mean_absolute_error: 0.4925
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1029 - mean_absolute_error: 0.2394 - val_loss: 0.4536 - val_mean_absolute_error: 0.4996
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1018 - mean_absolute_error: 0.2214 - val_loss: 0.4085 - val_mean_absolute_error: 0.4738
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0730 - mean_absolute_error: 0.2021 - val_loss: 0.3692 - val_mean_absolute_error: 0.4549
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0936 - mean_absolute_error: 0.2171 - val_loss: 0.3644 - val_mean_absolute_error: 0.4477
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0911 - mean_absolute_error: 0.2104 - val_loss: 0.3797 - val_mean_absolute_error: 0.4622
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0695 - mean_absolute_error: 0.1815 - val_loss: 0.3754 - val_mean_absolute_error: 0.4598
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0614 - mean_absolute_error: 0.1653 - val_loss: 0.3695 - val_mean_absolute_error: 0.4535
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0677 - mean_absolute_error: 0.1646 - val_loss: 0.3727 - val_mean_absolute_error: 0.4504
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0763 - mean_absolute_error: 0.2018 - val_loss: 0.3573 - val_mean_absolute_error: 0.4400
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0679 - mean_absolute_error: 0.1754 - val_loss: 0.3424 - val_mean_absolute_error: 0.4366
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0736 - mean_absolute_error: 0.1745 - val_loss: 0.3846 - val_mean_absolute_error: 0.4577
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0731 - mean_absolute_error: 0.1997 - val_loss: 0.3998 - val_mean_absolute_error: 0.4675
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0740 - mean_absolute_error: 0.1832 - val_loss: 0.3446 - val_mean_absolute_error: 0.4464
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0932 - mean_absolute_error: 0.2052 - val_loss: 0.3358 - val_mean_absolute_error: 0.4411
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0623 - mean_absolute_error: 0.1699 - val_loss: 0.3407 - val_mean_absolute_error: 0.4384
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0861 - mean_absolute_error: 0.1994 - val_loss: 0.3589 - val_mean_absolute_error: 0.4463
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0693 - mean_absolute_error: 0.1625 - val_loss: 0.3527 - val_mean_absolute_error: 0.4411
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0679 - mean_absolute_error: 0.1775 - val_loss: 0.3477 - val_mean_absolute_error: 0.4343
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0962 - mean_absolute_error: 0.2028 - val_loss: 0.3506 - val_mean_absolute_error: 0.4333
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0595 - mean_absolute_error: 0.1635 - val_loss: 0.3166 - val_mean_absolute_error: 0.4114
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0662 - mean_absolute_error: 0.1589 - val_loss: 0.3189 - val_mean_absolute_error: 0.4092
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0640 - mean_absolute_error: 0.1732 - val_loss: 0.3497 - val_mean_absolute_error: 0.4329
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0685 - mean_absolute_error: 0.1639 - val_loss: 0.3510 - val_mean_absolute_error: 0.4411
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0696 - mean_absolute_error: 0.1878 - val_loss: 0.3122 - val_mean_absolute_error: 0.4144
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0581 - mean_absolute_error: 0.1584 - val_loss: 0.3056 - val_mean_absolute_error: 0.4141
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0561 - mean_absolute_error: 0.1623 - val_loss: 0.3240 - val_mean_absolute_error: 0.4216
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0556 - mean_absolute_error: 0.1600 - val_loss: 0.3251 - val_mean_absolute_error: 0.4223
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0501 - mean_absolute_error: 0.1429 - val_loss: 0.2975 - val_mean_absolute_error: 0.4071
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0654 - mean_absolute_error: 0.1756 - val_loss: 0.3097 - val_mean_absolute_error: 0.4185
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0439 - mean_absolute_error: 0.1367 - val_loss: 0.3325 - val_mean_absolute_error: 0.4328
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0420 - mean_absolute_error: 0.1268 - val_loss: 0.3190 - val_mean_absolute_error: 0.4307

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.1658 - mean_absolute_error: 0.2642 - val_loss: 0.0310 - val_mean_absolute_error: 0.0959
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1269 - mean_absolute_error: 0.2382 - val_loss: 0.0363 - val_mean_absolute_error: 0.1151
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0692 - mean_absolute_error: 0.1729 - val_loss: 0.0441 - val_mean_absolute_error: 0.1332
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0845 - mean_absolute_error: 0.2041 - val_loss: 0.0468 - val_mean_absolute_error: 0.1231
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0776 - mean_absolute_error: 0.1919 - val_loss: 0.0526 - val_mean_absolute_error: 0.1517
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0745 - mean_absolute_error: 0.1949 - val_loss: 0.0569 - val_mean_absolute_error: 0.1573
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0691 - mean_absolute_error: 0.1748 - val_loss: 0.0548 - val_mean_absolute_error: 0.1530
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0609 - mean_absolute_error: 0.1664 - val_loss: 0.0537 - val_mean_absolute_error: 0.1540
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0794 - mean_absolute_error: 0.1919 - val_loss: 0.0557 - val_mean_absolute_error: 0.1503
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0662 - mean_absolute_error: 0.1747 - val_loss: 0.0533 - val_mean_absolute_error: 0.1575
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0526 - mean_absolute_error: 0.1554 - val_loss: 0.0544 - val_mean_absolute_error: 0.1761
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0749 - mean_absolute_error: 0.1793 - val_loss: 0.0520 - val_mean_absolute_error: 0.1576
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0611 - mean_absolute_error: 0.1573 - val_loss: 0.0548 - val_mean_absolute_error: 0.1647
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0593 - mean_absolute_error: 0.1498 - val_loss: 0.0659 - val_mean_absolute_error: 0.1926
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0517 - mean_absolute_error: 0.1591 - val_loss: 0.0593 - val_mean_absolute_error: 0.1680
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0432 - mean_absolute_error: 0.1282 - val_loss: 0.0644 - val_mean_absolute_error: 0.1540
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0528 - mean_absolute_error: 0.1629 - val_loss: 0.0558 - val_mean_absolute_error: 0.1720
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0437 - mean_absolute_error: 0.1280 - val_loss: 0.0645 - val_mean_absolute_error: 0.2022
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0560 - mean_absolute_error: 0.1616 - val_loss: 0.0552 - val_mean_absolute_error: 0.1608
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0581 - mean_absolute_error: 0.1660 - val_loss: 0.0626 - val_mean_absolute_error: 0.1704
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0551 - mean_absolute_error: 0.1595 - val_loss: 0.0650 - val_mean_absolute_error: 0.1943
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0631 - mean_absolute_error: 0.1669 - val_loss: 0.0770 - val_mean_absolute_error: 0.2205
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0430 - mean_absolute_error: 0.1403 - val_loss: 0.0535 - val_mean_absolute_error: 0.1598
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0544 - mean_absolute_error: 0.1585 - val_loss: 0.0554 - val_mean_absolute_error: 0.1577
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0493 - mean_absolute_error: 0.1531 - val_loss: 0.0504 - val_mean_absolute_error: 0.1502
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0530 - mean_absolute_error: 0.1469 - val_loss: 0.0650 - val_mean_absolute_error: 0.1966
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0528 - mean_absolute_error: 0.1462 - val_loss: 0.0544 - val_mean_absolute_error: 0.1640
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0436 - mean_absolute_error: 0.1306 - val_loss: 0.0550 - val_mean_absolute_error: 0.1623
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0386 - mean_absolute_error: 0.1222 - val_loss: 0.0651 - val_mean_absolute_error: 0.1969
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0522 - mean_absolute_error: 0.1543 - val_loss: 0.0691 - val_mean_absolute_error: 0.2097
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0551 - mean_absolute_error: 0.1631 - val_loss: 0.0564 - val_mean_absolute_error: 0.1748
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0527 - mean_absolute_error: 0.1337 - val_loss: 0.0578 - val_mean_absolute_error: 0.1560
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0554 - mean_absolute_error: 0.1575 - val_loss: 0.0618 - val_mean_absolute_error: 0.1764
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0553 - mean_absolute_error: 0.1418 - val_loss: 0.0608 - val_mean_absolute_error: 0.1691
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0520 - mean_absolute_error: 0.1431 - val_loss: 0.0629 - val_mean_absolute_error: 0.1672
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0579 - mean_absolute_error: 0.1553 - val_loss: 0.0626 - val_mean_absolute_error: 0.1697
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0454 - mean_absolute_error: 0.1296 - val_loss: 0.0656 - val_mean_absolute_error: 0.1817
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0777 - mean_absolute_error: 0.1683 - val_loss: 0.0675 - val_mean_absolute_error: 0.1877
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0458 - mean_absolute_error: 0.1377 - val_loss: 0.0673 - val_mean_absolute_error: 0.1801
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0503 - mean_absolute_error: 0.1465 - val_loss: 0.0677 - val_mean_absolute_error: 0.1880
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0385 - mean_absolute_error: 0.1183 - val_loss: 0.0662 - val_mean_absolute_error: 0.1860
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0445 - mean_absolute_error: 0.1290 - val_loss: 0.0609 - val_mean_absolute_error: 0.1752
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0470 - mean_absolute_error: 0.1288 - val_loss: 0.0594 - val_mean_absolute_error: 0.1736
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0416 - mean_absolute_error: 0.1226 - val_loss: 0.0591 - val_mean_absolute_error: 0.1743
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0376 - mean_absolute_error: 0.1194 - val_loss: 0.0594 - val_mean_absolute_error: 0.1757
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0370 - mean_absolute_error: 0.1167 - val_loss: 0.0599 - val_mean_absolute_error: 0.1677
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0530 - mean_absolute_error: 0.1538 - val_loss: 0.0623 - val_mean_absolute_error: 0.1800
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0448 - mean_absolute_error: 0.1273 - val_loss: 0.0645 - val_mean_absolute_error: 0.1846
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0413 - mean_absolute_error: 0.1328 - val_loss: 0.0622 - val_mean_absolute_error: 0.1729
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0425 - mean_absolute_error: 0.1247 - val_loss: 0.0615 - val_mean_absolute_error: 0.1706

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.0483 - mean_absolute_error: 0.1414 - val_loss: 0.0190 - val_mean_absolute_error: 0.0609
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0489 - mean_absolute_error: 0.1346 - val_loss: 0.0220 - val_mean_absolute_error: 0.0743
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0537 - mean_absolute_error: 0.1554 - val_loss: 0.0245 - val_mean_absolute_error: 0.0834
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0481 - mean_absolute_error: 0.1454 - val_loss: 0.0299 - val_mean_absolute_error: 0.1091
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0562 - mean_absolute_error: 0.1442 - val_loss: 0.0380 - val_mean_absolute_error: 0.1282
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0629 - mean_absolute_error: 0.1766 - val_loss: 0.0291 - val_mean_absolute_error: 0.0943
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0435 - mean_absolute_error: 0.1316 - val_loss: 0.0252 - val_mean_absolute_error: 0.0756
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0574 - mean_absolute_error: 0.1499 - val_loss: 0.0272 - val_mean_absolute_error: 0.0746
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0435 - mean_absolute_error: 0.1335 - val_loss: 0.0299 - val_mean_absolute_error: 0.0843
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0401 - mean_absolute_error: 0.1217 - val_loss: 0.0262 - val_mean_absolute_error: 0.0928
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0446 - mean_absolute_error: 0.1340 - val_loss: 0.0266 - val_mean_absolute_error: 0.0962
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0474 - mean_absolute_error: 0.1500 - val_loss: 0.0249 - val_mean_absolute_error: 0.0813
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0570 - mean_absolute_error: 0.1586 - val_loss: 0.0307 - val_mean_absolute_error: 0.0914
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0403 - mean_absolute_error: 0.1297 - val_loss: 0.0271 - val_mean_absolute_error: 0.0813
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0430 - mean_absolute_error: 0.1418 - val_loss: 0.0266 - val_mean_absolute_error: 0.0917
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0501 - mean_absolute_error: 0.1288 - val_loss: 0.0307 - val_mean_absolute_error: 0.0848
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0398 - mean_absolute_error: 0.1204 - val_loss: 0.0347 - val_mean_absolute_error: 0.1077
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0393 - mean_absolute_error: 0.1278 - val_loss: 0.0341 - val_mean_absolute_error: 0.1011
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0461 - mean_absolute_error: 0.1210 - val_loss: 0.0333 - val_mean_absolute_error: 0.0971
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0417 - mean_absolute_error: 0.1310 - val_loss: 0.0364 - val_mean_absolute_error: 0.1054
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0438 - mean_absolute_error: 0.1359 - val_loss: 0.0396 - val_mean_absolute_error: 0.1160
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0483 - mean_absolute_error: 0.1378 - val_loss: 0.0309 - val_mean_absolute_error: 0.1023
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0390 - mean_absolute_error: 0.1212 - val_loss: 0.0294 - val_mean_absolute_error: 0.1041
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0371 - mean_absolute_error: 0.1160 - val_loss: 0.0289 - val_mean_absolute_error: 0.0944
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0464 - mean_absolute_error: 0.1243 - val_loss: 0.0306 - val_mean_absolute_error: 0.0973
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0386 - mean_absolute_error: 0.1247 - val_loss: 0.0284 - val_mean_absolute_error: 0.0898
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0373 - mean_absolute_error: 0.1152 - val_loss: 0.0262 - val_mean_absolute_error: 0.0838
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0427 - mean_absolute_error: 0.1259 - val_loss: 0.0292 - val_mean_absolute_error: 0.0914
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0325 - mean_absolute_error: 0.1123 - val_loss: 0.0347 - val_mean_absolute_error: 0.1116
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0529 - mean_absolute_error: 0.1458 - val_loss: 0.0248 - val_mean_absolute_error: 0.0787
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0434 - mean_absolute_error: 0.1281 - val_loss: 0.0225 - val_mean_absolute_error: 0.0754
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0375 - mean_absolute_error: 0.1029 - val_loss: 0.0236 - val_mean_absolute_error: 0.0737
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0363 - mean_absolute_error: 0.1077 - val_loss: 0.0259 - val_mean_absolute_error: 0.0780
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0494 - mean_absolute_error: 0.1351 - val_loss: 0.0283 - val_mean_absolute_error: 0.0882
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0360 - mean_absolute_error: 0.1167 - val_loss: 0.0323 - val_mean_absolute_error: 0.1021
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0340 - mean_absolute_error: 0.1127 - val_loss: 0.0436 - val_mean_absolute_error: 0.1382
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0457 - mean_absolute_error: 0.1296 - val_loss: 0.0360 - val_mean_absolute_error: 0.1181
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0468 - mean_absolute_error: 0.1260 - val_loss: 0.0315 - val_mean_absolute_error: 0.1050
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0495 - mean_absolute_error: 0.1427 - val_loss: 0.0326 - val_mean_absolute_error: 0.1052
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0496 - mean_absolute_error: 0.1332 - val_loss: 0.0326 - val_mean_absolute_error: 0.1016
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0392 - mean_absolute_error: 0.1183 - val_loss: 0.0356 - val_mean_absolute_error: 0.1105
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0440 - mean_absolute_error: 0.1202 - val_loss: 0.0351 - val_mean_absolute_error: 0.1037
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0423 - mean_absolute_error: 0.1284 - val_loss: 0.0344 - val_mean_absolute_error: 0.1011
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0457 - mean_absolute_error: 0.1337 - val_loss: 0.0323 - val_mean_absolute_error: 0.1051
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0392 - mean_absolute_error: 0.1193 - val_loss: 0.0311 - val_mean_absolute_error: 0.1084
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0387 - mean_absolute_error: 0.1216 - val_loss: 0.0289 - val_mean_absolute_error: 0.1063
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0370 - mean_absolute_error: 0.1148 - val_loss: 0.0304 - val_mean_absolute_error: 0.1050
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0468 - mean_absolute_error: 0.1309 - val_loss: 0.0285 - val_mean_absolute_error: 0.0998
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0348 - mean_absolute_error: 0.1074 - val_loss: 0.0276 - val_mean_absolute_error: 0.0955
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0341 - mean_absolute_error: 0.1068 - val_loss: 0.0330 - val_mean_absolute_error: 0.1096

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.0543 - mean_absolute_error: 0.1527 - val_loss: 0.0194 - val_mean_absolute_error: 0.0602
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0456 - mean_absolute_error: 0.1400 - val_loss: 0.0194 - val_mean_absolute_error: 0.0545
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0406 - mean_absolute_error: 0.1244 - val_loss: 0.0193 - val_mean_absolute_error: 0.0573
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0405 - mean_absolute_error: 0.1195 - val_loss: 0.0239 - val_mean_absolute_error: 0.0814
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0481 - mean_absolute_error: 0.1368 - val_loss: 0.0303 - val_mean_absolute_error: 0.1049
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0450 - mean_absolute_error: 0.1280 - val_loss: 0.0313 - val_mean_absolute_error: 0.1092
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0396 - mean_absolute_error: 0.1257 - val_loss: 0.0211 - val_mean_absolute_error: 0.0685
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0487 - mean_absolute_error: 0.1401 - val_loss: 0.0194 - val_mean_absolute_error: 0.0602
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0441 - mean_absolute_error: 0.1342 - val_loss: 0.0192 - val_mean_absolute_error: 0.0570
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0324 - mean_absolute_error: 0.0998 - val_loss: 0.0288 - val_mean_absolute_error: 0.1011
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0517 - mean_absolute_error: 0.1389 - val_loss: 0.0218 - val_mean_absolute_error: 0.0691
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0439 - mean_absolute_error: 0.1295 - val_loss: 0.0198 - val_mean_absolute_error: 0.0617
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0321 - mean_absolute_error: 0.1101 - val_loss: 0.0208 - val_mean_absolute_error: 0.0667
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0512 - mean_absolute_error: 0.1390 - val_loss: 0.0230 - val_mean_absolute_error: 0.0772
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0309 - mean_absolute_error: 0.1053 - val_loss: 0.0245 - val_mean_absolute_error: 0.0822
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0416 - mean_absolute_error: 0.1219 - val_loss: 0.0230 - val_mean_absolute_error: 0.0782
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0446 - mean_absolute_error: 0.1285 - val_loss: 0.0287 - val_mean_absolute_error: 0.1028
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0343 - mean_absolute_error: 0.1097 - val_loss: 0.0287 - val_mean_absolute_error: 0.0977
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0418 - mean_absolute_error: 0.1128 - val_loss: 0.0230 - val_mean_absolute_error: 0.0744
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0378 - mean_absolute_error: 0.1143 - val_loss: 0.0210 - val_mean_absolute_error: 0.0722
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0614 - mean_absolute_error: 0.1491 - val_loss: 0.0248 - val_mean_absolute_error: 0.0837
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0450 - mean_absolute_error: 0.1338 - val_loss: 0.0242 - val_mean_absolute_error: 0.0815
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0374 - mean_absolute_error: 0.1119 - val_loss: 0.0257 - val_mean_absolute_error: 0.0918
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0447 - mean_absolute_error: 0.1354 - val_loss: 0.0311 - val_mean_absolute_error: 0.1100
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0565 - mean_absolute_error: 0.1340 - val_loss: 0.0350 - val_mean_absolute_error: 0.1165
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0333 - mean_absolute_error: 0.1100 - val_loss: 0.0256 - val_mean_absolute_error: 0.0898
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0380 - mean_absolute_error: 0.1183 - val_loss: 0.0239 - val_mean_absolute_error: 0.0870
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0419 - mean_absolute_error: 0.1243 - val_loss: 0.0294 - val_mean_absolute_error: 0.1082
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0364 - mean_absolute_error: 0.1150 - val_loss: 0.0323 - val_mean_absolute_error: 0.1152
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0434 - mean_absolute_error: 0.1262 - val_loss: 0.0238 - val_mean_absolute_error: 0.0818
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0432 - mean_absolute_error: 0.1342 - val_loss: 0.0218 - val_mean_absolute_error: 0.0768
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0555 - mean_absolute_error: 0.1411 - val_loss: 0.0248 - val_mean_absolute_error: 0.0855
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0461 - mean_absolute_error: 0.1378 - val_loss: 0.0250 - val_mean_absolute_error: 0.0880
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0443 - mean_absolute_error: 0.1184 - val_loss: 0.0322 - val_mean_absolute_error: 0.1145
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0299 - mean_absolute_error: 0.0924 - val_loss: 0.0387 - val_mean_absolute_error: 0.1341
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0339 - mean_absolute_error: 0.1043 - val_loss: 0.0298 - val_mean_absolute_error: 0.1050
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0374 - mean_absolute_error: 0.1129 - val_loss: 0.0244 - val_mean_absolute_error: 0.0869
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0486 - mean_absolute_error: 0.1350 - val_loss: 0.0238 - val_mean_absolute_error: 0.0816
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0350 - mean_absolute_error: 0.1129 - val_loss: 0.0425 - val_mean_absolute_error: 0.1435
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0555 - mean_absolute_error: 0.1639 - val_loss: 0.0365 - val_mean_absolute_error: 0.1235
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0445 - mean_absolute_error: 0.1322 - val_loss: 0.0287 - val_mean_absolute_error: 0.1021
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0407 - mean_absolute_error: 0.1308 - val_loss: 0.0251 - val_mean_absolute_error: 0.0906
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0306 - mean_absolute_error: 0.1015 - val_loss: 0.0309 - val_mean_absolute_error: 0.1139
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0516 - mean_absolute_error: 0.1356 - val_loss: 0.0260 - val_mean_absolute_error: 0.0937
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0453 - mean_absolute_error: 0.1361 - val_loss: 0.0283 - val_mean_absolute_error: 0.1031
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0426 - mean_absolute_error: 0.1300 - val_loss: 0.0269 - val_mean_absolute_error: 0.0956
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0297 - mean_absolute_error: 0.0954 - val_loss: 0.0284 - val_mean_absolute_error: 0.0975
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0345 - mean_absolute_error: 0.1037 - val_loss: 0.0273 - val_mean_absolute_error: 0.0952
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0477 - mean_absolute_error: 0.1432 - val_loss: 0.0282 - val_mean_absolute_error: 0.0975
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0502 - mean_absolute_error: 0.1302 - val_loss: 0.0309 - val_mean_absolute_error: 0.1021

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.0327 - mean_absolute_error: 0.1071 - val_loss: 0.0268 - val_mean_absolute_error: 0.0746
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0391 - mean_absolute_error: 0.1206 - val_loss: 0.0322 - val_mean_absolute_error: 0.1027
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0328 - mean_absolute_error: 0.1070 - val_loss: 0.0231 - val_mean_absolute_error: 0.0772
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0357 - mean_absolute_error: 0.1182 - val_loss: 0.0217 - val_mean_absolute_error: 0.0690
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0337 - mean_absolute_error: 0.1153 - val_loss: 0.0267 - val_mean_absolute_error: 0.0815
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0342 - mean_absolute_error: 0.0998 - val_loss: 0.0322 - val_mean_absolute_error: 0.0956
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0359 - mean_absolute_error: 0.1151 - val_loss: 0.0249 - val_mean_absolute_error: 0.0765
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0358 - mean_absolute_error: 0.1198 - val_loss: 0.0204 - val_mean_absolute_error: 0.0691
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0305 - mean_absolute_error: 0.0993 - val_loss: 0.0197 - val_mean_absolute_error: 0.0681
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0318 - mean_absolute_error: 0.0979 - val_loss: 0.0256 - val_mean_absolute_error: 0.0843
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0357 - mean_absolute_error: 0.1013 - val_loss: 0.0309 - val_mean_absolute_error: 0.0986
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 33ms/step - loss: 0.0367 - mean_absolute_error: 0.1126 - val_loss: 0.0294 - val_mean_absolute_error: 0.0981
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0331 - mean_absolute_error: 0.1060 - val_loss: 0.0236 - val_mean_absolute_error: 0.0832
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0318 - mean_absolute_error: 0.0997 - val_loss: 0.0232 - val_mean_absolute_error: 0.0825
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 84ms/step - loss: 0.0348 - mean_absolute_error: 0.0993 - val_loss: 0.0259 - val_mean_absolute_error: 0.0887
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0355 - mean_absolute_error: 0.1114 - val_loss: 0.0300 - val_mean_absolute_error: 0.0973
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0314 - mean_absolute_error: 0.0995 - val_loss: 0.0331 - val_mean_absolute_error: 0.1060
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0412 - mean_absolute_error: 0.1137 - val_loss: 0.0247 - val_mean_absolute_error: 0.0861
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0417 - mean_absolute_error: 0.1243 - val_loss: 0.0198 - val_mean_absolute_error: 0.0694
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0479 - mean_absolute_error: 0.1314 - val_loss: 0.0207 - val_mean_absolute_error: 0.0712
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0316 - mean_absolute_error: 0.1049 - val_loss: 0.0248 - val_mean_absolute_error: 0.0842
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0435 - mean_absolute_error: 0.0944 - val_loss: 0.0224 - val_mean_absolute_error: 0.0750
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 0.0369 - mean_absolute_error: 0.1099 - val_loss: 0.0200 - val_mean_absolute_error: 0.0660
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0394 - mean_absolute_error: 0.1171 - val_loss: 0.0188 - val_mean_absolute_error: 0.0566
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 27ms/step - loss: 0.0488 - mean_absolute_error: 0.1261 - val_loss: 0.0206 - val_mean_absolute_error: 0.0609
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0344 - mean_absolute_error: 0.1068 - val_loss: 0.0214 - val_mean_absolute_error: 0.0702
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0377 - mean_absolute_error: 0.1077 - val_loss: 0.0296 - val_mean_absolute_error: 0.0960
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0292 - mean_absolute_error: 0.0932 - val_loss: 0.0322 - val_mean_absolute_error: 0.1012
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.0363 - mean_absolute_error: 0.1124 - val_loss: 0.0241 - val_mean_absolute_error: 0.0832
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0399 - mean_absolute_error: 0.1120 - val_loss: 0.0197 - val_mean_absolute_error: 0.0575
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0276 - mean_absolute_error: 0.0868 - val_loss: 0.0202 - val_mean_absolute_error: 0.0666
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0307 - mean_absolute_error: 0.1027 - val_loss: 0.0212 - val_mean_absolute_error: 0.0707
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0301 - mean_absolute_error: 0.0935 - val_loss: 0.0209 - val_mean_absolute_error: 0.0639
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0345 - mean_absolute_error: 0.1039 - val_loss: 0.0311 - val_mean_absolute_error: 0.1070
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0379 - mean_absolute_error: 0.1167 - val_loss: 0.0273 - val_mean_absolute_error: 0.0892
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0265 - mean_absolute_error: 0.0881 - val_loss: 0.0326 - val_mean_absolute_error: 0.1186
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0391 - mean_absolute_error: 0.1254 - val_loss: 0.0204 - val_mean_absolute_error: 0.0565
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0346 - mean_absolute_error: 0.1036 - val_loss: 0.0189 - val_mean_absolute_error: 0.0579
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0294 - mean_absolute_error: 0.0952 - val_loss: 0.0220 - val_mean_absolute_error: 0.0705
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0324 - mean_absolute_error: 0.1089 - val_loss: 0.0243 - val_mean_absolute_error: 0.0785
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0324 - mean_absolute_error: 0.1021 - val_loss: 0.0238 - val_mean_absolute_error: 0.0785
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0269 - mean_absolute_error: 0.0903 - val_loss: 0.0246 - val_mean_absolute_error: 0.0831
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0285 - mean_absolute_error: 0.0915 - val_loss: 0.0224 - val_mean_absolute_error: 0.0768
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0347 - mean_absolute_error: 0.1173 - val_loss: 0.0252 - val_mean_absolute_error: 0.0906
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0372 - mean_absolute_error: 0.1123 - val_loss: 0.0296 - val_mean_absolute_error: 0.1012
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0334 - mean_absolute_error: 0.1004 - val_loss: 0.0265 - val_mean_absolute_error: 0.0864
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0445 - mean_absolute_error: 0.1201 - val_loss: 0.0240 - val_mean_absolute_error: 0.0766
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0309 - mean_absolute_error: 0.1010 - val_loss: 0.0235 - val_mean_absolute_error: 0.0835
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0280 - mean_absolute_error: 0.0906 - val_loss: 0.0259 - val_mean_absolute_error: 0.0951
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0348 - mean_absolute_error: 0.1103 - val_loss: 0.0223 - val_mean_absolute_error: 0.0692
Validation losses: [0.3190247118473053, 0.061459314078092575, 0.03299913927912712, 0.03092125616967678, 0.022289220243692398]
HPS: {'player_emb_dim': 32, 'dense_units': 48, 'dense_units_2': 48, 'learning_rate': 0.01, 'dropout_rate': 0.1, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 3}. MSE during RandomSearch: 0.12901265919208527. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
2025-08-09 17:09:21.696909: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 92ms/step - loss: 1.9225 - mean_absolute_error: 1.0984 - val_loss: 1.0945 - val_mean_absolute_error: 0.8356
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.8145 - mean_absolute_error: 0.6992 - val_loss: 0.7295 - val_mean_absolute_error: 0.6947
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.6631 - mean_absolute_error: 0.6494 - val_loss: 0.5892 - val_mean_absolute_error: 0.6812
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.4507 - mean_absolute_error: 0.5337 - val_loss: 1.0486 - val_mean_absolute_error: 0.7746
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.5773 - mean_absolute_error: 0.5957 - val_loss: 0.4045 - val_mean_absolute_error: 0.5892
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3405 - mean_absolute_error: 0.4394 - val_loss: 0.3651 - val_mean_absolute_error: 0.4938
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2356 - mean_absolute_error: 0.3562 - val_loss: 0.3067 - val_mean_absolute_error: 0.4592
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1778 - mean_absolute_error: 0.3078 - val_loss: 0.3666 - val_mean_absolute_error: 0.4800
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1928 - mean_absolute_error: 0.3232 - val_loss: 0.3245 - val_mean_absolute_error: 0.4552
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1230 - mean_absolute_error: 0.2535 - val_loss: 0.2669 - val_mean_absolute_error: 0.4270
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1129 - mean_absolute_error: 0.2419 - val_loss: 0.2384 - val_mean_absolute_error: 0.4035
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1527 - mean_absolute_error: 0.2671 - val_loss: 0.1943 - val_mean_absolute_error: 0.3645
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0890 - mean_absolute_error: 0.2144 - val_loss: 0.1973 - val_mean_absolute_error: 0.3710
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1008 - mean_absolute_error: 0.2404 - val_loss: 0.2045 - val_mean_absolute_error: 0.3630
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0800 - mean_absolute_error: 0.1970 - val_loss: 0.1785 - val_mean_absolute_error: 0.3276
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0953 - mean_absolute_error: 0.2238 - val_loss: 0.1640 - val_mean_absolute_error: 0.3225
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0938 - mean_absolute_error: 0.2015 - val_loss: 0.2053 - val_mean_absolute_error: 0.3735
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0842 - mean_absolute_error: 0.2026 - val_loss: 0.2247 - val_mean_absolute_error: 0.3833
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0774 - mean_absolute_error: 0.1998 - val_loss: 0.1943 - val_mean_absolute_error: 0.3769
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0888 - mean_absolute_error: 0.2129 - val_loss: 0.1829 - val_mean_absolute_error: 0.3531
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0806 - mean_absolute_error: 0.1955 - val_loss: 0.1741 - val_mean_absolute_error: 0.3329
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0970 - mean_absolute_error: 0.2237 - val_loss: 0.1857 - val_mean_absolute_error: 0.3423
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0705 - mean_absolute_error: 0.1844 - val_loss: 0.1857 - val_mean_absolute_error: 0.3458
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0765 - mean_absolute_error: 0.1842 - val_loss: 0.1817 - val_mean_absolute_error: 0.3501
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0487 - mean_absolute_error: 0.1549 - val_loss: 0.1977 - val_mean_absolute_error: 0.3686
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0644 - mean_absolute_error: 0.1731 - val_loss: 0.2062 - val_mean_absolute_error: 0.3679
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.0614 - mean_absolute_error: 0.1575 - val_loss: 0.1670 - val_mean_absolute_error: 0.3362
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0661 - mean_absolute_error: 0.1938 - val_loss: 0.1555 - val_mean_absolute_error: 0.3072
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0710 - mean_absolute_error: 0.1773 - val_loss: 0.1460 - val_mean_absolute_error: 0.2809
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0696 - mean_absolute_error: 0.1549 - val_loss: 0.1394 - val_mean_absolute_error: 0.2987
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0479 - mean_absolute_error: 0.1467 - val_loss: 0.1674 - val_mean_absolute_error: 0.3224
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0445 - mean_absolute_error: 0.1340 - val_loss: 0.1547 - val_mean_absolute_error: 0.3207
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0623 - mean_absolute_error: 0.1688 - val_loss: 0.1583 - val_mean_absolute_error: 0.3304
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0505 - mean_absolute_error: 0.1462 - val_loss: 0.1743 - val_mean_absolute_error: 0.3523
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0584 - mean_absolute_error: 0.1741 - val_loss: 0.1716 - val_mean_absolute_error: 0.3458
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0383 - mean_absolute_error: 0.1302 - val_loss: 0.1772 - val_mean_absolute_error: 0.3291
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0605 - mean_absolute_error: 0.1515 - val_loss: 0.1646 - val_mean_absolute_error: 0.3096
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0901 - mean_absolute_error: 0.2104 - val_loss: 0.1651 - val_mean_absolute_error: 0.3218
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0422 - mean_absolute_error: 0.1297 - val_loss: 0.1916 - val_mean_absolute_error: 0.3567
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0527 - mean_absolute_error: 0.1469 - val_loss: 0.1916 - val_mean_absolute_error: 0.3687
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0538 - mean_absolute_error: 0.1454 - val_loss: 0.1806 - val_mean_absolute_error: 0.3652
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0597 - mean_absolute_error: 0.1648 - val_loss: 0.1872 - val_mean_absolute_error: 0.3550
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0486 - mean_absolute_error: 0.1441 - val_loss: 0.1794 - val_mean_absolute_error: 0.3439
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0831 - mean_absolute_error: 0.1731 - val_loss: 0.1512 - val_mean_absolute_error: 0.3145
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0423 - mean_absolute_error: 0.1327 - val_loss: 0.1566 - val_mean_absolute_error: 0.3263
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0522 - mean_absolute_error: 0.1404 - val_loss: 0.2428 - val_mean_absolute_error: 0.3944
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0742 - mean_absolute_error: 0.1863 - val_loss: 0.1741 - val_mean_absolute_error: 0.3384
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0396 - mean_absolute_error: 0.1306 - val_loss: 0.1437 - val_mean_absolute_error: 0.3083
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0406 - mean_absolute_error: 0.1305 - val_loss: 0.1820 - val_mean_absolute_error: 0.3290
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0661 - mean_absolute_error: 0.1728 - val_loss: 0.1560 - val_mean_absolute_error: 0.3077

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.0577 - mean_absolute_error: 0.1603 - val_loss: 0.0480 - val_mean_absolute_error: 0.1238
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0634 - mean_absolute_error: 0.1740 - val_loss: 0.0484 - val_mean_absolute_error: 0.1285
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0682 - mean_absolute_error: 0.1776 - val_loss: 0.0491 - val_mean_absolute_error: 0.1426
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0554 - mean_absolute_error: 0.1567 - val_loss: 0.0496 - val_mean_absolute_error: 0.1420
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0562 - mean_absolute_error: 0.1615 - val_loss: 0.0736 - val_mean_absolute_error: 0.2041
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0747 - mean_absolute_error: 0.1959 - val_loss: 0.0917 - val_mean_absolute_error: 0.2082
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0977 - mean_absolute_error: 0.2255 - val_loss: 0.0745 - val_mean_absolute_error: 0.1997
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0694 - mean_absolute_error: 0.1896 - val_loss: 0.0473 - val_mean_absolute_error: 0.1536
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0478 - mean_absolute_error: 0.1528 - val_loss: 0.0565 - val_mean_absolute_error: 0.1491
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0682 - mean_absolute_error: 0.1581 - val_loss: 0.0475 - val_mean_absolute_error: 0.1428
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0515 - mean_absolute_error: 0.1503 - val_loss: 0.0509 - val_mean_absolute_error: 0.1441
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0414 - mean_absolute_error: 0.1281 - val_loss: 0.0647 - val_mean_absolute_error: 0.1596
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0517 - mean_absolute_error: 0.1336 - val_loss: 0.0645 - val_mean_absolute_error: 0.1913
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0627 - mean_absolute_error: 0.1673 - val_loss: 0.0696 - val_mean_absolute_error: 0.1874
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0477 - mean_absolute_error: 0.1445 - val_loss: 0.0879 - val_mean_absolute_error: 0.1967
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0583 - mean_absolute_error: 0.1443 - val_loss: 0.0679 - val_mean_absolute_error: 0.1864
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0395 - mean_absolute_error: 0.1332 - val_loss: 0.0795 - val_mean_absolute_error: 0.1918
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0572 - mean_absolute_error: 0.1570 - val_loss: 0.1057 - val_mean_absolute_error: 0.2070
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0467 - mean_absolute_error: 0.1380 - val_loss: 0.0819 - val_mean_absolute_error: 0.2224
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0592 - mean_absolute_error: 0.1643 - val_loss: 0.0918 - val_mean_absolute_error: 0.2012
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0454 - mean_absolute_error: 0.1198 - val_loss: 0.0947 - val_mean_absolute_error: 0.1886
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0452 - mean_absolute_error: 0.1360 - val_loss: 0.0674 - val_mean_absolute_error: 0.1910
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0502 - mean_absolute_error: 0.1447 - val_loss: 0.0885 - val_mean_absolute_error: 0.2244
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0654 - mean_absolute_error: 0.1677 - val_loss: 0.1178 - val_mean_absolute_error: 0.2369
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0621 - mean_absolute_error: 0.1801 - val_loss: 0.0909 - val_mean_absolute_error: 0.2105
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0530 - mean_absolute_error: 0.1454 - val_loss: 0.0762 - val_mean_absolute_error: 0.1868
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0430 - mean_absolute_error: 0.1392 - val_loss: 0.0918 - val_mean_absolute_error: 0.2063
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0481 - mean_absolute_error: 0.1478 - val_loss: 0.0880 - val_mean_absolute_error: 0.1844
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0596 - mean_absolute_error: 0.1598 - val_loss: 0.0779 - val_mean_absolute_error: 0.1831
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0371 - mean_absolute_error: 0.1205 - val_loss: 0.0838 - val_mean_absolute_error: 0.1777
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0655 - mean_absolute_error: 0.1780 - val_loss: 0.0867 - val_mean_absolute_error: 0.2140
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0468 - mean_absolute_error: 0.1423 - val_loss: 0.0703 - val_mean_absolute_error: 0.1967
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0484 - mean_absolute_error: 0.1411 - val_loss: 0.1224 - val_mean_absolute_error: 0.2287
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0507 - mean_absolute_error: 0.1531 - val_loss: 0.1048 - val_mean_absolute_error: 0.1978
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0415 - mean_absolute_error: 0.1324 - val_loss: 0.0925 - val_mean_absolute_error: 0.2109
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0702 - mean_absolute_error: 0.1903 - val_loss: 0.0892 - val_mean_absolute_error: 0.2028
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0384 - mean_absolute_error: 0.1270 - val_loss: 0.0951 - val_mean_absolute_error: 0.1918
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0547 - mean_absolute_error: 0.1424 - val_loss: 0.0916 - val_mean_absolute_error: 0.1900
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0369 - mean_absolute_error: 0.1164 - val_loss: 0.0865 - val_mean_absolute_error: 0.2059
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.0323 - mean_absolute_error: 0.1223 - val_loss: 0.1061 - val_mean_absolute_error: 0.2123
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0516 - mean_absolute_error: 0.1477 - val_loss: 0.1169 - val_mean_absolute_error: 0.2171
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0619 - mean_absolute_error: 0.1651 - val_loss: 0.0952 - val_mean_absolute_error: 0.1929
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0519 - mean_absolute_error: 0.1488 - val_loss: 0.0697 - val_mean_absolute_error: 0.1865
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0503 - mean_absolute_error: 0.1464 - val_loss: 0.1113 - val_mean_absolute_error: 0.2134
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0418 - mean_absolute_error: 0.1332 - val_loss: 0.1031 - val_mean_absolute_error: 0.2119
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0362 - mean_absolute_error: 0.1162 - val_loss: 0.0889 - val_mean_absolute_error: 0.2139
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0533 - mean_absolute_error: 0.1547 - val_loss: 0.1182 - val_mean_absolute_error: 0.2287
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0555 - mean_absolute_error: 0.1522 - val_loss: 0.1419 - val_mean_absolute_error: 0.2571
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0567 - mean_absolute_error: 0.1490 - val_loss: 0.0889 - val_mean_absolute_error: 0.2308
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.0825 - mean_absolute_error: 0.2141 - val_loss: 0.0973 - val_mean_absolute_error: 0.2243

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.0717 - mean_absolute_error: 0.1771 - val_loss: 0.0928 - val_mean_absolute_error: 0.2290
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1057 - mean_absolute_error: 0.2440 - val_loss: 0.0236 - val_mean_absolute_error: 0.0935
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0853 - mean_absolute_error: 0.2079 - val_loss: 0.0298 - val_mean_absolute_error: 0.1193
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0540 - mean_absolute_error: 0.1567 - val_loss: 0.0592 - val_mean_absolute_error: 0.1782
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0775 - mean_absolute_error: 0.1747 - val_loss: 0.0400 - val_mean_absolute_error: 0.1435
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0365 - mean_absolute_error: 0.1272 - val_loss: 0.0495 - val_mean_absolute_error: 0.1616
...
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0477 - mean_absolute_error: 0.1345 - val_loss: 0.0934 - val_mean_absolute_error: 0.2337

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.0520 - mean_absolute_error: 0.1484 - val_loss: 0.0242 - val_mean_absolute_error: 0.1044
...
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0515 - mean_absolute_error: 0.1277 - val_loss: 0.0504 - val_mean_absolute_error: 0.1633

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.0312 - mean_absolute_error: 0.1117 - val_loss: 0.0352 - val_mean_absolute_error: 0.1322
...
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0236 - mean_absolute_error: 0.0987 - val_loss: 0.0231 - val_mean_absolute_error: 0.0973
Validation losses: [0.1559898853302002, 0.0973275676369667, 0.093429796397686, 0.050443004816770554, 0.023146633058786392]
HPS: {'player_emb_dim': 32, 'dense_units': 64, 'dense_units_2': 128, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 2}. MSE during RandomSearch: 0.12653082609176636. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 92ms/step - loss: 1.1749 - mean_absolute_error: 0.8884 - val_loss: 1.1482 - val_mean_absolute_error: 0.9089
...
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2246 - mean_absolute_error: 0.2983 - val_loss: 0.2687 - val_mean_absolute_error: 0.4321

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.1487 - mean_absolute_error: 0.2671 - val_loss: 0.2302 - val_mean_absolute_error: 0.3623
...
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1515 - mean_absolute_error: 0.2648 - val_loss: 0.3033 - val_mean_absolute_error: 0.3697
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1191 - mean_absolute_error: 0.2311 - val_loss: 0.3767 - val_mean_absolute_error: 0.4631
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1156 - mean_absolute_error: 0.2156 - val_loss: 0.4075 - val_mean_absolute_error: 0.4896
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1364 - mean_absolute_error: 0.2350 - val_loss: 0.2724 - val_mean_absolute_error: 0.3684
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1194 - mean_absolute_error: 0.2330 - val_loss: 0.2549 - val_mean_absolute_error: 0.3487
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1034 - mean_absolute_error: 0.2215 - val_loss: 0.4347 - val_mean_absolute_error: 0.4829
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1330 - mean_absolute_error: 0.2314 - val_loss: 0.3643 - val_mean_absolute_error: 0.4305
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0772 - mean_absolute_error: 0.1790 - val_loss: 0.2109 - val_mean_absolute_error: 0.3039
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1461 - mean_absolute_error: 0.2604 - val_loss: 0.3963 - val_mean_absolute_error: 0.4329
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.1122 - mean_absolute_error: 0.2266 - val_loss: 0.4836 - val_mean_absolute_error: 0.4921
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0933 - mean_absolute_error: 0.2171 - val_loss: 0.3545 - val_mean_absolute_error: 0.4307
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0920 - mean_absolute_error: 0.2029 - val_loss: 0.2425 - val_mean_absolute_error: 0.3187
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0924 - mean_absolute_error: 0.2029 - val_loss: 0.3447 - val_mean_absolute_error: 0.3969
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0884 - mean_absolute_error: 0.2039 - val_loss: 0.4360 - val_mean_absolute_error: 0.4700

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.1681 - mean_absolute_error: 0.2742 - val_loss: 0.1190 - val_mean_absolute_error: 0.2453
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1833 - mean_absolute_error: 0.2907 - val_loss: 0.0733 - val_mean_absolute_error: 0.1833
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1513 - mean_absolute_error: 0.2648 - val_loss: 0.1448 - val_mean_absolute_error: 0.2584
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1203 - mean_absolute_error: 0.2520 - val_loss: 0.1113 - val_mean_absolute_error: 0.2451
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1691 - mean_absolute_error: 0.3019 - val_loss: 0.1199 - val_mean_absolute_error: 0.2472
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1381 - mean_absolute_error: 0.2416 - val_loss: 0.0997 - val_mean_absolute_error: 0.2042
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1280 - mean_absolute_error: 0.2540 - val_loss: 0.0679 - val_mean_absolute_error: 0.1722
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2029 - mean_absolute_error: 0.2980 - val_loss: 0.2598 - val_mean_absolute_error: 0.3952
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.2075 - mean_absolute_error: 0.3323 - val_loss: 0.2550 - val_mean_absolute_error: 0.3756
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2009 - mean_absolute_error: 0.3133 - val_loss: 0.0599 - val_mean_absolute_error: 0.1772
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1943 - mean_absolute_error: 0.2897 - val_loss: 0.0717 - val_mean_absolute_error: 0.1893
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1182 - mean_absolute_error: 0.2230 - val_loss: 0.2323 - val_mean_absolute_error: 0.3698
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1976 - mean_absolute_error: 0.3264 - val_loss: 0.1567 - val_mean_absolute_error: 0.2867
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1522 - mean_absolute_error: 0.2562 - val_loss: 0.1229 - val_mean_absolute_error: 0.2416
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1725 - mean_absolute_error: 0.2913 - val_loss: 0.0753 - val_mean_absolute_error: 0.2009
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1275 - mean_absolute_error: 0.2540 - val_loss: 0.1959 - val_mean_absolute_error: 0.3452
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2280 - mean_absolute_error: 0.3363 - val_loss: 0.1969 - val_mean_absolute_error: 0.3456
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1650 - mean_absolute_error: 0.2944 - val_loss: 0.0736 - val_mean_absolute_error: 0.2047
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 0.1499 - mean_absolute_error: 0.2900 - val_loss: 0.0626 - val_mean_absolute_error: 0.1785
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1389 - mean_absolute_error: 0.2242 - val_loss: 0.1469 - val_mean_absolute_error: 0.3007
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1068 - mean_absolute_error: 0.2443 - val_loss: 0.1596 - val_mean_absolute_error: 0.3002
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2068 - mean_absolute_error: 0.2999 - val_loss: 0.1819 - val_mean_absolute_error: 0.3430
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1887 - mean_absolute_error: 0.3140 - val_loss: 0.2634 - val_mean_absolute_error: 0.3873
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.2199 - mean_absolute_error: 0.3227 - val_loss: 0.1308 - val_mean_absolute_error: 0.2517
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1430 - mean_absolute_error: 0.2697 - val_loss: 0.0968 - val_mean_absolute_error: 0.2101
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1972 - mean_absolute_error: 0.3246 - val_loss: 0.1665 - val_mean_absolute_error: 0.3220
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1559 - mean_absolute_error: 0.2867 - val_loss: 0.2870 - val_mean_absolute_error: 0.4445
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1666 - mean_absolute_error: 0.2940 - val_loss: 0.1639 - val_mean_absolute_error: 0.2972
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 58ms/step - loss: 0.1445 - mean_absolute_error: 0.2660 - val_loss: 0.1099 - val_mean_absolute_error: 0.2398
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2265 - mean_absolute_error: 0.3069 - val_loss: 0.2593 - val_mean_absolute_error: 0.4027
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2024 - mean_absolute_error: 0.3378 - val_loss: 0.4575 - val_mean_absolute_error: 0.5407
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.3091 - mean_absolute_error: 0.3961 - val_loss: 0.2939 - val_mean_absolute_error: 0.4329
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2028 - mean_absolute_error: 0.2999 - val_loss: 0.1346 - val_mean_absolute_error: 0.2786
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2615 - mean_absolute_error: 0.3590 - val_loss: 0.1416 - val_mean_absolute_error: 0.2927
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1655 - mean_absolute_error: 0.2810 - val_loss: 0.2516 - val_mean_absolute_error: 0.4052
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2038 - mean_absolute_error: 0.3044 - val_loss: 0.2778 - val_mean_absolute_error: 0.4275
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2478 - mean_absolute_error: 0.3510 - val_loss: 0.1754 - val_mean_absolute_error: 0.3373
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.2194 - mean_absolute_error: 0.3310 - val_loss: 0.1048 - val_mean_absolute_error: 0.2389
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1898 - mean_absolute_error: 0.3134 - val_loss: 0.1956 - val_mean_absolute_error: 0.3506
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1455 - mean_absolute_error: 0.2646 - val_loss: 0.2578 - val_mean_absolute_error: 0.4035
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1542 - mean_absolute_error: 0.2627 - val_loss: 0.2056 - val_mean_absolute_error: 0.3628
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1699 - mean_absolute_error: 0.2776 - val_loss: 0.1737 - val_mean_absolute_error: 0.3428
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2029 - mean_absolute_error: 0.3113 - val_loss: 0.1598 - val_mean_absolute_error: 0.3183
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1882 - mean_absolute_error: 0.2926 - val_loss: 0.1889 - val_mean_absolute_error: 0.3540
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1432 - mean_absolute_error: 0.2451 - val_loss: 0.1478 - val_mean_absolute_error: 0.3119
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1708 - mean_absolute_error: 0.3053 - val_loss: 0.1457 - val_mean_absolute_error: 0.3215
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1551 - mean_absolute_error: 0.2649 - val_loss: 0.1666 - val_mean_absolute_error: 0.3293
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1556 - mean_absolute_error: 0.2862 - val_loss: 0.1631 - val_mean_absolute_error: 0.3143
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1507 - mean_absolute_error: 0.2740 - val_loss: 0.1578 - val_mean_absolute_error: 0.3079
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0974 - mean_absolute_error: 0.2199 - val_loss: 0.1675 - val_mean_absolute_error: 0.3280

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 32ms/step - loss: 0.1355 - mean_absolute_error: 0.2598 - val_loss: 0.0746 - val_mean_absolute_error: 0.1835
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1384 - mean_absolute_error: 0.2732 - val_loss: 0.1513 - val_mean_absolute_error: 0.3139
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1787 - mean_absolute_error: 0.3014 - val_loss: 0.1794 - val_mean_absolute_error: 0.3417
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1845 - mean_absolute_error: 0.3100 - val_loss: 0.1657 - val_mean_absolute_error: 0.3067
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1489 - mean_absolute_error: 0.2842 - val_loss: 0.1943 - val_mean_absolute_error: 0.3301
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.1996 - mean_absolute_error: 0.3169 - val_loss: 0.1313 - val_mean_absolute_error: 0.2785
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1805 - mean_absolute_error: 0.3028 - val_loss: 0.2089 - val_mean_absolute_error: 0.3561
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1685 - mean_absolute_error: 0.2897 - val_loss: 0.1714 - val_mean_absolute_error: 0.3247
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2177 - mean_absolute_error: 0.3339 - val_loss: 0.1847 - val_mean_absolute_error: 0.3373
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1750 - mean_absolute_error: 0.2989 - val_loss: 0.1537 - val_mean_absolute_error: 0.2852
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2294 - mean_absolute_error: 0.3512 - val_loss: 0.0810 - val_mean_absolute_error: 0.2042
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.2508 - mean_absolute_error: 0.3575 - val_loss: 0.1291 - val_mean_absolute_error: 0.2851
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2469 - mean_absolute_error: 0.3616 - val_loss: 0.1090 - val_mean_absolute_error: 0.2381
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.3120 - mean_absolute_error: 0.3950 - val_loss: 0.0961 - val_mean_absolute_error: 0.2126
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1479 - mean_absolute_error: 0.2480 - val_loss: 0.1522 - val_mean_absolute_error: 0.2889
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1977 - mean_absolute_error: 0.3322 - val_loss: 0.0889 - val_mean_absolute_error: 0.2241
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2289 - mean_absolute_error: 0.3381 - val_loss: 0.1298 - val_mean_absolute_error: 0.2677
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1504 - mean_absolute_error: 0.2650 - val_loss: 0.1539 - val_mean_absolute_error: 0.3058
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1537 - mean_absolute_error: 0.2797 - val_loss: 0.1377 - val_mean_absolute_error: 0.2948
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2637 - mean_absolute_error: 0.3023 - val_loss: 0.1395 - val_mean_absolute_error: 0.2880
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1797 - mean_absolute_error: 0.2950 - val_loss: 0.1237 - val_mean_absolute_error: 0.2682
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2974 - mean_absolute_error: 0.3675 - val_loss: 0.1301 - val_mean_absolute_error: 0.2803
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1365 - mean_absolute_error: 0.2715 - val_loss: 0.1175 - val_mean_absolute_error: 0.2521
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1668 - mean_absolute_error: 0.2772 - val_loss: 0.1382 - val_mean_absolute_error: 0.2649
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1544 - mean_absolute_error: 0.2952 - val_loss: 0.1230 - val_mean_absolute_error: 0.2524
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1628 - mean_absolute_error: 0.2985 - val_loss: 0.1502 - val_mean_absolute_error: 0.2735
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1041 - mean_absolute_error: 0.2129 - val_loss: 0.1378 - val_mean_absolute_error: 0.2695
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.2882 - mean_absolute_error: 0.3846 - val_loss: 0.1096 - val_mean_absolute_error: 0.2391
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 0.2122 - mean_absolute_error: 0.3193 - val_loss: 0.0951 - val_mean_absolute_error: 0.2252
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1789 - mean_absolute_error: 0.3090 - val_loss: 0.1282 - val_mean_absolute_error: 0.2525
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1561 - mean_absolute_error: 0.2896 - val_loss: 0.1475 - val_mean_absolute_error: 0.2691
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1373 - mean_absolute_error: 0.2518 - val_loss: 0.1527 - val_mean_absolute_error: 0.3011
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.1475 - mean_absolute_error: 0.2796 - val_loss: 0.1216 - val_mean_absolute_error: 0.2635
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1422 - mean_absolute_error: 0.2622 - val_loss: 0.1414 - val_mean_absolute_error: 0.2761
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1821 - mean_absolute_error: 0.3111 - val_loss: 0.1686 - val_mean_absolute_error: 0.2969
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1349 - mean_absolute_error: 0.2678 - val_loss: 0.1259 - val_mean_absolute_error: 0.2633
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1391 - mean_absolute_error: 0.2460 - val_loss: 0.1078 - val_mean_absolute_error: 0.2421
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2276 - mean_absolute_error: 0.3315 - val_loss: 0.1328 - val_mean_absolute_error: 0.2525
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2053 - mean_absolute_error: 0.2918 - val_loss: 0.1259 - val_mean_absolute_error: 0.2702
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1916 - mean_absolute_error: 0.2936 - val_loss: 0.1477 - val_mean_absolute_error: 0.2696
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1310 - mean_absolute_error: 0.2637 - val_loss: 0.1743 - val_mean_absolute_error: 0.2877
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1263 - mean_absolute_error: 0.2540 - val_loss: 0.1540 - val_mean_absolute_error: 0.2889
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1556 - mean_absolute_error: 0.2785 - val_loss: 0.1689 - val_mean_absolute_error: 0.2944
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1126 - mean_absolute_error: 0.2352 - val_loss: 0.1748 - val_mean_absolute_error: 0.2778
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2375 - mean_absolute_error: 0.3176 - val_loss: 0.1640 - val_mean_absolute_error: 0.2915
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1242 - mean_absolute_error: 0.2519 - val_loss: 0.1799 - val_mean_absolute_error: 0.3176
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2383 - mean_absolute_error: 0.3330 - val_loss: 0.1397 - val_mean_absolute_error: 0.2645
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1338 - mean_absolute_error: 0.2457 - val_loss: 0.1261 - val_mean_absolute_error: 0.2500
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1425 - mean_absolute_error: 0.2418 - val_loss: 0.1667 - val_mean_absolute_error: 0.3105
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2077 - mean_absolute_error: 0.3329 - val_loss: 0.1269 - val_mean_absolute_error: 0.2722

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.1566 - mean_absolute_error: 0.2595 - val_loss: 0.1769 - val_mean_absolute_error: 0.3421
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1709 - mean_absolute_error: 0.2843 - val_loss: 0.3051 - val_mean_absolute_error: 0.4545
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1828 - mean_absolute_error: 0.3023 - val_loss: 0.3406 - val_mean_absolute_error: 0.4618
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1692 - mean_absolute_error: 0.2974 - val_loss: 0.2080 - val_mean_absolute_error: 0.3648
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1157 - mean_absolute_error: 0.2516 - val_loss: 0.0993 - val_mean_absolute_error: 0.2351
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1558 - mean_absolute_error: 0.2580 - val_loss: 0.1494 - val_mean_absolute_error: 0.2867
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1697 - mean_absolute_error: 0.3059 - val_loss: 0.2631 - val_mean_absolute_error: 0.3998
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0825 - mean_absolute_error: 0.1896 - val_loss: 0.2105 - val_mean_absolute_error: 0.3583
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0983 - mean_absolute_error: 0.2112 - val_loss: 0.1741 - val_mean_absolute_error: 0.3068
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0865 - mean_absolute_error: 0.1935 - val_loss: 0.1788 - val_mean_absolute_error: 0.3059
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1831 - mean_absolute_error: 0.2460 - val_loss: 0.3115 - val_mean_absolute_error: 0.4545
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.2645 - mean_absolute_error: 0.2840 - val_loss: 0.3268 - val_mean_absolute_error: 0.4659
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1560 - mean_absolute_error: 0.2642 - val_loss: 0.2096 - val_mean_absolute_error: 0.3561
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.1823 - mean_absolute_error: 0.2607 - val_loss: 0.1664 - val_mean_absolute_error: 0.3168
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0983 - mean_absolute_error: 0.2193 - val_loss: 0.2170 - val_mean_absolute_error: 0.3774
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0829 - mean_absolute_error: 0.1984 - val_loss: 0.2529 - val_mean_absolute_error: 0.4132
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1228 - mean_absolute_error: 0.2399 - val_loss: 0.2557 - val_mean_absolute_error: 0.4299
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1103 - mean_absolute_error: 0.2184 - val_loss: 0.2020 - val_mean_absolute_error: 0.3764
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1258 - mean_absolute_error: 0.2439 - val_loss: 0.1577 - val_mean_absolute_error: 0.3128
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1861 - mean_absolute_error: 0.2433 - val_loss: 0.2350 - val_mean_absolute_error: 0.3937
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0658 - mean_absolute_error: 0.1661 - val_loss: 0.3380 - val_mean_absolute_error: 0.4947
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1045 - mean_absolute_error: 0.2177 - val_loss: 0.3490 - val_mean_absolute_error: 0.4932
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1077 - mean_absolute_error: 0.2155 - val_loss: 0.2510 - val_mean_absolute_error: 0.4051
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0879 - mean_absolute_error: 0.1949 - val_loss: 0.2010 - val_mean_absolute_error: 0.3429
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1310 - mean_absolute_error: 0.2485 - val_loss: 0.2199 - val_mean_absolute_error: 0.3608
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0941 - mean_absolute_error: 0.1875 - val_loss: 0.2595 - val_mean_absolute_error: 0.4249
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0726 - mean_absolute_error: 0.1905 - val_loss: 0.2906 - val_mean_absolute_error: 0.4562
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1294 - mean_absolute_error: 0.1984 - val_loss: 0.2527 - val_mean_absolute_error: 0.4066
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1333 - mean_absolute_error: 0.2240 - val_loss: 0.2145 - val_mean_absolute_error: 0.3928
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1057 - mean_absolute_error: 0.2259 - val_loss: 0.2224 - val_mean_absolute_error: 0.4032
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0798 - mean_absolute_error: 0.1817 - val_loss: 0.2144 - val_mean_absolute_error: 0.3821
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0805 - mean_absolute_error: 0.1761 - val_loss: 0.2341 - val_mean_absolute_error: 0.3675
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0931 - mean_absolute_error: 0.2068 - val_loss: 0.2390 - val_mean_absolute_error: 0.3852
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0869 - mean_absolute_error: 0.2117 - val_loss: 0.2487 - val_mean_absolute_error: 0.4178
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.1381 - mean_absolute_error: 0.2364 - val_loss: 0.1974 - val_mean_absolute_error: 0.3692
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0873 - mean_absolute_error: 0.2052 - val_loss: 0.2213 - val_mean_absolute_error: 0.3726
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 74ms/step - loss: 0.0989 - mean_absolute_error: 0.2034 - val_loss: 0.2150 - val_mean_absolute_error: 0.3642
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 0.0714 - mean_absolute_error: 0.1662 - val_loss: 0.1896 - val_mean_absolute_error: 0.3444
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0731 - mean_absolute_error: 0.1706 - val_loss: 0.2511 - val_mean_absolute_error: 0.4175
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0815 - mean_absolute_error: 0.1993 - val_loss: 0.2916 - val_mean_absolute_error: 0.4558
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0736 - mean_absolute_error: 0.1692 - val_loss: 0.1923 - val_mean_absolute_error: 0.3687
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0793 - mean_absolute_error: 0.1910 - val_loss: 0.1596 - val_mean_absolute_error: 0.3294
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.1167 - mean_absolute_error: 0.2072 - val_loss: 0.2411 - val_mean_absolute_error: 0.4219
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0693 - mean_absolute_error: 0.1622 - val_loss: 0.2445 - val_mean_absolute_error: 0.4265
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0762 - mean_absolute_error: 0.1821 - val_loss: 0.2234 - val_mean_absolute_error: 0.4034
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1026 - mean_absolute_error: 0.2146 - val_loss: 0.2076 - val_mean_absolute_error: 0.3864
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1160 - mean_absolute_error: 0.2140 - val_loss: 0.2180 - val_mean_absolute_error: 0.3902
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0879 - mean_absolute_error: 0.1855 - val_loss: 0.2999 - val_mean_absolute_error: 0.4596
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0737 - mean_absolute_error: 0.1762 - val_loss: 0.3078 - val_mean_absolute_error: 0.4691
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1002 - mean_absolute_error: 0.2175 - val_loss: 0.2447 - val_mean_absolute_error: 0.4154
Validation losses: [0.26874738931655884, 0.4359869360923767, 0.16746126115322113, 0.12691226601600647, 0.2447347640991211]
HPS: {'player_emb_dim': 32, 'dense_units': 48, 'dense_units_2': 112, 'learning_rate': 0.001, 'dropout_rate': 0.1, 'dropout_rate_2': 0.1, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 3}. MSE during RandomSearch: 0.21696588397026062. Starting evaluation across all k folds...
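Before re-running the final evaluation, it is worth condensing the per-fold validation losses printed above into a single cross-validation score with a spread estimate; the mean is what the search effectively optimises, and the standard deviation hints at how unstable a 20-game dataset makes each fold. A minimal sketch (using the fold losses copied from the log and plain NumPy, not the chapter's helper code, which is assumed to live in `competition_manager`):

```python
import numpy as np

# Per-fold validation losses (MSE) for folds 1..5, as printed in the log above.
val_losses = [0.26874738931655884, 0.4359869360923767, 0.16746126115322113,
              0.12691226601600647, 0.2447347640991211]

# Mean is the headline cross-validation score; std shows fold-to-fold spread,
# which is expected to be large with so few recorded games per player.
mean_cv_loss = float(np.mean(val_losses))
std_cv_loss = float(np.std(val_losses))

print(f"CV loss: {mean_cv_loss:.4f} ± {std_cv_loss:.4f}")
```

With these numbers the mean is roughly 0.249 with a standard deviation near 0.11 — the best fold (0.127) and the worst (0.436) differ by more than a factor of three, which is a useful reminder to treat any single fold's score with caution.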

FOLD 1
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 2s 155ms/step - loss: 1.0195 - mean_absolute_error: 0.8179 - val_loss: 1.0271 - val_mean_absolute_error: 0.7904
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.8639 - mean_absolute_error: 0.7531 - val_loss: 0.9557 - val_mean_absolute_error: 0.7617
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.6710 - mean_absolute_error: 0.6613 - val_loss: 0.8851 - val_mean_absolute_error: 0.7208
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.6181 - mean_absolute_error: 0.6215 - val_loss: 0.7971 - val_mean_absolute_error: 0.6628
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.4959 - mean_absolute_error: 0.5572 - val_loss: 0.7231 - val_mean_absolute_error: 0.6065
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3164 - mean_absolute_error: 0.4606 - val_loss: 0.6891 - val_mean_absolute_error: 0.6040
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.3454 - mean_absolute_error: 0.4225 - val_loss: 0.6343 - val_mean_absolute_error: 0.5813
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.2649 - mean_absolute_error: 0.3993 - val_loss: 0.6012 - val_mean_absolute_error: 0.5741
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1769 - mean_absolute_error: 0.3186 - val_loss: 0.5771 - val_mean_absolute_error: 0.5528
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1810 - mean_absolute_error: 0.3347 - val_loss: 0.4834 - val_mean_absolute_error: 0.5162
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1849 - mean_absolute_error: 0.3163 - val_loss: 0.4629 - val_mean_absolute_error: 0.5248
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.1456 - mean_absolute_error: 0.2968 - val_loss: 0.5574 - val_mean_absolute_error: 0.5830
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1850 - mean_absolute_error: 0.3353 - val_loss: 0.4816 - val_mean_absolute_error: 0.5566
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1910 - mean_absolute_error: 0.3364 - val_loss: 0.4422 - val_mean_absolute_error: 0.5406
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1251 - mean_absolute_error: 0.2658 - val_loss: 0.5251 - val_mean_absolute_error: 0.5759
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1078 - mean_absolute_error: 0.2290 - val_loss: 0.4664 - val_mean_absolute_error: 0.5506
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 14ms/step - loss: 0.1117 - mean_absolute_error: 0.2573 - val_loss: 0.4005 - val_mean_absolute_error: 0.5235
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0845 - mean_absolute_error: 0.2212 - val_loss: 0.4223 - val_mean_absolute_error: 0.5292
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1211 - mean_absolute_error: 0.2508 - val_loss: 0.4599 - val_mean_absolute_error: 0.5498
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1032 - mean_absolute_error: 0.2345 - val_loss: 0.4268 - val_mean_absolute_error: 0.5326
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0843 - mean_absolute_error: 0.2172 - val_loss: 0.3606 - val_mean_absolute_error: 0.4896
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0828 - mean_absolute_error: 0.2128 - val_loss: 0.3559 - val_mean_absolute_error: 0.4922
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.1022 - mean_absolute_error: 0.2269 - val_loss: 0.3869 - val_mean_absolute_error: 0.5090
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0630 - mean_absolute_error: 0.1776 - val_loss: 0.4240 - val_mean_absolute_error: 0.5318
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0789 - mean_absolute_error: 0.1931 - val_loss: 0.3752 - val_mean_absolute_error: 0.5064
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0803 - mean_absolute_error: 0.1895 - val_loss: 0.3376 - val_mean_absolute_error: 0.4878
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0747 - mean_absolute_error: 0.1966 - val_loss: 0.3577 - val_mean_absolute_error: 0.5069
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0707 - mean_absolute_error: 0.1817 - val_loss: 0.3945 - val_mean_absolute_error: 0.5251
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0753 - mean_absolute_error: 0.1997 - val_loss: 0.4155 - val_mean_absolute_error: 0.5378
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0744 - mean_absolute_error: 0.1770 - val_loss: 0.3646 - val_mean_absolute_error: 0.5069
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0514 - mean_absolute_error: 0.1531 - val_loss: 0.3612 - val_mean_absolute_error: 0.5025
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0762 - mean_absolute_error: 0.2010 - val_loss: 0.3619 - val_mean_absolute_error: 0.4992
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0636 - mean_absolute_error: 0.1571 - val_loss: 0.3346 - val_mean_absolute_error: 0.4793
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0581 - mean_absolute_error: 0.1741 - val_loss: 0.3371 - val_mean_absolute_error: 0.4812
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0833 - mean_absolute_error: 0.2118 - val_loss: 0.3318 - val_mean_absolute_error: 0.4789
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0793 - mean_absolute_error: 0.1838 - val_loss: 0.3242 - val_mean_absolute_error: 0.4757
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0505 - mean_absolute_error: 0.1476 - val_loss: 0.3241 - val_mean_absolute_error: 0.4749
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0605 - mean_absolute_error: 0.1470 - val_loss: 0.3436 - val_mean_absolute_error: 0.4844
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0722 - mean_absolute_error: 0.1988 - val_loss: 0.3697 - val_mean_absolute_error: 0.4946
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0713 - mean_absolute_error: 0.2002 - val_loss: 0.3391 - val_mean_absolute_error: 0.4689
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0508 - mean_absolute_error: 0.1607 - val_loss: 0.3188 - val_mean_absolute_error: 0.4586
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0620 - mean_absolute_error: 0.1516 - val_loss: 0.3411 - val_mean_absolute_error: 0.4713
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0590 - mean_absolute_error: 0.1546 - val_loss: 0.3446 - val_mean_absolute_error: 0.4752
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0573 - mean_absolute_error: 0.1614 - val_loss: 0.3267 - val_mean_absolute_error: 0.4676
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0487 - mean_absolute_error: 0.1505 - val_loss: 0.3110 - val_mean_absolute_error: 0.4554
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0541 - mean_absolute_error: 0.1650 - val_loss: 0.3107 - val_mean_absolute_error: 0.4531
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0672 - mean_absolute_error: 0.1725 - val_loss: 0.3369 - val_mean_absolute_error: 0.4731
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0624 - mean_absolute_error: 0.1532 - val_loss: 0.3180 - val_mean_absolute_error: 0.4644
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0405 - mean_absolute_error: 0.1183 - val_loss: 0.3142 - val_mean_absolute_error: 0.4658
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0674 - mean_absolute_error: 0.1706 - val_loss: 0.3328 - val_mean_absolute_error: 0.4757

FOLD 2
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.1280 - mean_absolute_error: 0.2380 - val_loss: 0.0195 - val_mean_absolute_error: 0.0638
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0982 - mean_absolute_error: 0.2171 - val_loss: 0.0244 - val_mean_absolute_error: 0.0832
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0946 - mean_absolute_error: 0.2090 - val_loss: 0.0353 - val_mean_absolute_error: 0.1068
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0667 - mean_absolute_error: 0.1764 - val_loss: 0.0443 - val_mean_absolute_error: 0.1302
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0697 - mean_absolute_error: 0.1791 - val_loss: 0.0483 - val_mean_absolute_error: 0.1482
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0688 - mean_absolute_error: 0.1920 - val_loss: 0.0502 - val_mean_absolute_error: 0.1539
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0569 - mean_absolute_error: 0.1610 - val_loss: 0.0531 - val_mean_absolute_error: 0.1591
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0543 - mean_absolute_error: 0.1490 - val_loss: 0.0529 - val_mean_absolute_error: 0.1559
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0684 - mean_absolute_error: 0.1801 - val_loss: 0.0487 - val_mean_absolute_error: 0.1475
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0593 - mean_absolute_error: 0.1668 - val_loss: 0.0443 - val_mean_absolute_error: 0.1370
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0774 - mean_absolute_error: 0.1721 - val_loss: 0.0463 - val_mean_absolute_error: 0.1455
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0505 - mean_absolute_error: 0.1493 - val_loss: 0.0457 - val_mean_absolute_error: 0.1472
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0435 - mean_absolute_error: 0.1372 - val_loss: 0.0469 - val_mean_absolute_error: 0.1502
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0465 - mean_absolute_error: 0.1379 - val_loss: 0.0463 - val_mean_absolute_error: 0.1495
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0497 - mean_absolute_error: 0.1510 - val_loss: 0.0469 - val_mean_absolute_error: 0.1451
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0445 - mean_absolute_error: 0.1385 - val_loss: 0.0485 - val_mean_absolute_error: 0.1382
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0499 - mean_absolute_error: 0.1320 - val_loss: 0.0473 - val_mean_absolute_error: 0.1368
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0521 - mean_absolute_error: 0.1335 - val_loss: 0.0485 - val_mean_absolute_error: 0.1551
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0457 - mean_absolute_error: 0.1332 - val_loss: 0.0509 - val_mean_absolute_error: 0.1635
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0496 - mean_absolute_error: 0.1426 - val_loss: 0.0482 - val_mean_absolute_error: 0.1511
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0562 - mean_absolute_error: 0.1584 - val_loss: 0.0483 - val_mean_absolute_error: 0.1488
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0567 - mean_absolute_error: 0.1542 - val_loss: 0.0528 - val_mean_absolute_error: 0.1601
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0554 - mean_absolute_error: 0.1632 - val_loss: 0.0546 - val_mean_absolute_error: 0.1629
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0418 - mean_absolute_error: 0.1330 - val_loss: 0.0544 - val_mean_absolute_error: 0.1583
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0421 - mean_absolute_error: 0.1333 - val_loss: 0.0554 - val_mean_absolute_error: 0.1526
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0611 - mean_absolute_error: 0.1656 - val_loss: 0.0542 - val_mean_absolute_error: 0.1489
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0422 - mean_absolute_error: 0.1300 - val_loss: 0.0535 - val_mean_absolute_error: 0.1553
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0420 - mean_absolute_error: 0.1292 - val_loss: 0.0551 - val_mean_absolute_error: 0.1577
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0405 - mean_absolute_error: 0.1231 - val_loss: 0.0563 - val_mean_absolute_error: 0.1598
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0430 - mean_absolute_error: 0.1428 - val_loss: 0.0554 - val_mean_absolute_error: 0.1557
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0451 - mean_absolute_error: 0.1391 - val_loss: 0.0632 - val_mean_absolute_error: 0.1701
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0460 - mean_absolute_error: 0.1410 - val_loss: 0.0575 - val_mean_absolute_error: 0.1611
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0440 - mean_absolute_error: 0.1312 - val_loss: 0.0601 - val_mean_absolute_error: 0.1624
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0421 - mean_absolute_error: 0.1351 - val_loss: 0.0604 - val_mean_absolute_error: 0.1645
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0374 - mean_absolute_error: 0.1261 - val_loss: 0.0585 - val_mean_absolute_error: 0.1718
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0508 - mean_absolute_error: 0.1426 - val_loss: 0.0567 - val_mean_absolute_error: 0.1738
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0656 - mean_absolute_error: 0.1725 - val_loss: 0.0558 - val_mean_absolute_error: 0.1680
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0390 - mean_absolute_error: 0.1184 - val_loss: 0.0557 - val_mean_absolute_error: 0.1616
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0403 - mean_absolute_error: 0.1294 - val_loss: 0.0609 - val_mean_absolute_error: 0.1629
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0442 - mean_absolute_error: 0.1410 - val_loss: 0.0633 - val_mean_absolute_error: 0.1681
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0455 - mean_absolute_error: 0.1345 - val_loss: 0.0614 - val_mean_absolute_error: 0.1761
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0454 - mean_absolute_error: 0.1276 - val_loss: 0.0606 - val_mean_absolute_error: 0.1731
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0398 - mean_absolute_error: 0.1253 - val_loss: 0.0617 - val_mean_absolute_error: 0.1710
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0539 - mean_absolute_error: 0.1547 - val_loss: 0.0617 - val_mean_absolute_error: 0.1694
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0446 - mean_absolute_error: 0.1378 - val_loss: 0.0565 - val_mean_absolute_error: 0.1629
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0376 - mean_absolute_error: 0.1089 - val_loss: 0.0551 - val_mean_absolute_error: 0.1679
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0377 - mean_absolute_error: 0.1189 - val_loss: 0.0558 - val_mean_absolute_error: 0.1650
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0443 - mean_absolute_error: 0.1434 - val_loss: 0.0551 - val_mean_absolute_error: 0.1626
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0502 - mean_absolute_error: 0.1562 - val_loss: 0.0543 - val_mean_absolute_error: 0.1588
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0325 - mean_absolute_error: 0.1028 - val_loss: 0.0543 - val_mean_absolute_error: 0.1558

FOLD 3
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0522 - mean_absolute_error: 0.1523 - val_loss: 0.0206 - val_mean_absolute_error: 0.0762
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0578 - mean_absolute_error: 0.1624 - val_loss: 0.0240 - val_mean_absolute_error: 0.0921
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0497 - mean_absolute_error: 0.1433 - val_loss: 0.0246 - val_mean_absolute_error: 0.0845
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0623 - mean_absolute_error: 0.1782 - val_loss: 0.0234 - val_mean_absolute_error: 0.0894
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0624 - mean_absolute_error: 0.1764 - val_loss: 0.0243 - val_mean_absolute_error: 0.0877
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0440 - mean_absolute_error: 0.1232 - val_loss: 0.0219 - val_mean_absolute_error: 0.0803
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0443 - mean_absolute_error: 0.1267 - val_loss: 0.0216 - val_mean_absolute_error: 0.0822
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0492 - mean_absolute_error: 0.1489 - val_loss: 0.0222 - val_mean_absolute_error: 0.0839
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0517 - mean_absolute_error: 0.1337 - val_loss: 0.0225 - val_mean_absolute_error: 0.0841
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0420 - mean_absolute_error: 0.1347 - val_loss: 0.0236 - val_mean_absolute_error: 0.0889
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0381 - mean_absolute_error: 0.1245 - val_loss: 0.0275 - val_mean_absolute_error: 0.1026
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0515 - mean_absolute_error: 0.1526 - val_loss: 0.0342 - val_mean_absolute_error: 0.1058
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0495 - mean_absolute_error: 0.1459 - val_loss: 0.0305 - val_mean_absolute_error: 0.1028
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0504 - mean_absolute_error: 0.1418 - val_loss: 0.0235 - val_mean_absolute_error: 0.0820
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0532 - mean_absolute_error: 0.1460 - val_loss: 0.0238 - val_mean_absolute_error: 0.0850
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0459 - mean_absolute_error: 0.1406 - val_loss: 0.0229 - val_mean_absolute_error: 0.0846
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0396 - mean_absolute_error: 0.1276 - val_loss: 0.0325 - val_mean_absolute_error: 0.1136
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0456 - mean_absolute_error: 0.1405 - val_loss: 0.0352 - val_mean_absolute_error: 0.1143
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0491 - mean_absolute_error: 0.1483 - val_loss: 0.0282 - val_mean_absolute_error: 0.1023
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0404 - mean_absolute_error: 0.1196 - val_loss: 0.0261 - val_mean_absolute_error: 0.0991
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0520 - mean_absolute_error: 0.1475 - val_loss: 0.0265 - val_mean_absolute_error: 0.1008
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0325 - mean_absolute_error: 0.1089 - val_loss: 0.0278 - val_mean_absolute_error: 0.0996
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0475 - mean_absolute_error: 0.1197 - val_loss: 0.0285 - val_mean_absolute_error: 0.0962
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0417 - mean_absolute_error: 0.1288 - val_loss: 0.0308 - val_mean_absolute_error: 0.1052
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0436 - mean_absolute_error: 0.1343 - val_loss: 0.0231 - val_mean_absolute_error: 0.0815
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0473 - mean_absolute_error: 0.1384 - val_loss: 0.0228 - val_mean_absolute_error: 0.0891
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0497 - mean_absolute_error: 0.1326 - val_loss: 0.0276 - val_mean_absolute_error: 0.0986
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0359 - mean_absolute_error: 0.1198 - val_loss: 0.0251 - val_mean_absolute_error: 0.0955
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0348 - mean_absolute_error: 0.1114 - val_loss: 0.0246 - val_mean_absolute_error: 0.0945
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0310 - mean_absolute_error: 0.0944 - val_loss: 0.0329 - val_mean_absolute_error: 0.1197
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0551 - mean_absolute_error: 0.1382 - val_loss: 0.0360 - val_mean_absolute_error: 0.1269
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0381 - mean_absolute_error: 0.1226 - val_loss: 0.0251 - val_mean_absolute_error: 0.0966
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0346 - mean_absolute_error: 0.1090 - val_loss: 0.0263 - val_mean_absolute_error: 0.0993
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0388 - mean_absolute_error: 0.1267 - val_loss: 0.0309 - val_mean_absolute_error: 0.1131
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0491 - mean_absolute_error: 0.1376 - val_loss: 0.0296 - val_mean_absolute_error: 0.1106
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0416 - mean_absolute_error: 0.1165 - val_loss: 0.0277 - val_mean_absolute_error: 0.1061
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0431 - mean_absolute_error: 0.1271 - val_loss: 0.0382 - val_mean_absolute_error: 0.1259
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0365 - mean_absolute_error: 0.1103 - val_loss: 0.0363 - val_mean_absolute_error: 0.1236
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0307 - mean_absolute_error: 0.1075 - val_loss: 0.0284 - val_mean_absolute_error: 0.1081
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0290 - mean_absolute_error: 0.1011 - val_loss: 0.0272 - val_mean_absolute_error: 0.1032
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 69ms/step - loss: 0.0350 - mean_absolute_error: 0.1117 - val_loss: 0.0304 - val_mean_absolute_error: 0.1075
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.0431 - mean_absolute_error: 0.1234 - val_loss: 0.0356 - val_mean_absolute_error: 0.1243
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0351 - mean_absolute_error: 0.1163 - val_loss: 0.0279 - val_mean_absolute_error: 0.0998
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0274 - mean_absolute_error: 0.0890 - val_loss: 0.0294 - val_mean_absolute_error: 0.1056
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0318 - mean_absolute_error: 0.1045 - val_loss: 0.0293 - val_mean_absolute_error: 0.1041
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0343 - mean_absolute_error: 0.1140 - val_loss: 0.0281 - val_mean_absolute_error: 0.1006
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0343 - mean_absolute_error: 0.1128 - val_loss: 0.0260 - val_mean_absolute_error: 0.0923
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0363 - mean_absolute_error: 0.1183 - val_loss: 0.0246 - val_mean_absolute_error: 0.0916
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0287 - mean_absolute_error: 0.1020 - val_loss: 0.0296 - val_mean_absolute_error: 0.1021
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0428 - mean_absolute_error: 0.1282 - val_loss: 0.0359 - val_mean_absolute_error: 0.1231

FOLD 4
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 25ms/step - loss: 0.0463 - mean_absolute_error: 0.1390 - val_loss: 0.0181 - val_mean_absolute_error: 0.0564
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0462 - mean_absolute_error: 0.1392 - val_loss: 0.0252 - val_mean_absolute_error: 0.0948
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0418 - mean_absolute_error: 0.1336 - val_loss: 0.0208 - val_mean_absolute_error: 0.0810
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0487 - mean_absolute_error: 0.1457 - val_loss: 0.0252 - val_mean_absolute_error: 0.0855
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0479 - mean_absolute_error: 0.1340 - val_loss: 0.0179 - val_mean_absolute_error: 0.0636
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0420 - mean_absolute_error: 0.1209 - val_loss: 0.0206 - val_mean_absolute_error: 0.0702
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0450 - mean_absolute_error: 0.1300 - val_loss: 0.0192 - val_mean_absolute_error: 0.0673
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0387 - mean_absolute_error: 0.1197 - val_loss: 0.0222 - val_mean_absolute_error: 0.0796
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0423 - mean_absolute_error: 0.1340 - val_loss: 0.0210 - val_mean_absolute_error: 0.0744
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0417 - mean_absolute_error: 0.1360 - val_loss: 0.0204 - val_mean_absolute_error: 0.0759
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0343 - mean_absolute_error: 0.1111 - val_loss: 0.0225 - val_mean_absolute_error: 0.0839
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0423 - mean_absolute_error: 0.1258 - val_loss: 0.0270 - val_mean_absolute_error: 0.0955
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0459 - mean_absolute_error: 0.1353 - val_loss: 0.0252 - val_mean_absolute_error: 0.0895
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0350 - mean_absolute_error: 0.1145 - val_loss: 0.0199 - val_mean_absolute_error: 0.0688
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0455 - mean_absolute_error: 0.1288 - val_loss: 0.0198 - val_mean_absolute_error: 0.0636
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0448 - mean_absolute_error: 0.1331 - val_loss: 0.0187 - val_mean_absolute_error: 0.0587
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0364 - mean_absolute_error: 0.1149 - val_loss: 0.0192 - val_mean_absolute_error: 0.0620
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0461 - mean_absolute_error: 0.1321 - val_loss: 0.0193 - val_mean_absolute_error: 0.0614
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0400 - mean_absolute_error: 0.1144 - val_loss: 0.0209 - val_mean_absolute_error: 0.0664
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0419 - mean_absolute_error: 0.1304 - val_loss: 0.0197 - val_mean_absolute_error: 0.0694
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0401 - mean_absolute_error: 0.1312 - val_loss: 0.0220 - val_mean_absolute_error: 0.0796
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0417 - mean_absolute_error: 0.1265 - val_loss: 0.0244 - val_mean_absolute_error: 0.0908
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0401 - mean_absolute_error: 0.1187 - val_loss: 0.0267 - val_mean_absolute_error: 0.0981
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0438 - mean_absolute_error: 0.1303 - val_loss: 0.0252 - val_mean_absolute_error: 0.0900
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 0.0279 - mean_absolute_error: 0.0882 - val_loss: 0.0231 - val_mean_absolute_error: 0.0824
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0307 - mean_absolute_error: 0.0984 - val_loss: 0.0242 - val_mean_absolute_error: 0.0876
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0428 - mean_absolute_error: 0.1108 - val_loss: 0.0271 - val_mean_absolute_error: 0.0899
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0353 - mean_absolute_error: 0.1054 - val_loss: 0.0282 - val_mean_absolute_error: 0.0879
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0546 - mean_absolute_error: 0.1386 - val_loss: 0.0263 - val_mean_absolute_error: 0.0915
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0278 - mean_absolute_error: 0.1001 - val_loss: 0.0293 - val_mean_absolute_error: 0.1071
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0312 - mean_absolute_error: 0.1075 - val_loss: 0.0296 - val_mean_absolute_error: 0.1036
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0304 - mean_absolute_error: 0.0998 - val_loss: 0.0278 - val_mean_absolute_error: 0.0924
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0341 - mean_absolute_error: 0.1050 - val_loss: 0.0282 - val_mean_absolute_error: 0.0933
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0332 - mean_absolute_error: 0.1203 - val_loss: 0.0244 - val_mean_absolute_error: 0.0916
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0284 - mean_absolute_error: 0.0999 - val_loss: 0.0241 - val_mean_absolute_error: 0.0887
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0379 - mean_absolute_error: 0.1273 - val_loss: 0.0275 - val_mean_absolute_error: 0.0942
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0393 - mean_absolute_error: 0.1311 - val_loss: 0.0258 - val_mean_absolute_error: 0.0987
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0336 - mean_absolute_error: 0.1030 - val_loss: 0.0280 - val_mean_absolute_error: 0.1026
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0342 - mean_absolute_error: 0.1099 - val_loss: 0.0285 - val_mean_absolute_error: 0.1011
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0285 - mean_absolute_error: 0.0906 - val_loss: 0.0311 - val_mean_absolute_error: 0.1042
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0315 - mean_absolute_error: 0.0984 - val_loss: 0.0307 - val_mean_absolute_error: 0.1027
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0310 - mean_absolute_error: 0.1019 - val_loss: 0.0325 - val_mean_absolute_error: 0.1148
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0343 - mean_absolute_error: 0.1135 - val_loss: 0.0339 - val_mean_absolute_error: 0.1112
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0501 - mean_absolute_error: 0.1222 - val_loss: 0.0352 - val_mean_absolute_error: 0.1192
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 18ms/step - loss: 0.0298 - mean_absolute_error: 0.1019 - val_loss: 0.0354 - val_mean_absolute_error: 0.1253
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0360 - mean_absolute_error: 0.1246 - val_loss: 0.0333 - val_mean_absolute_error: 0.1191
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0311 - mean_absolute_error: 0.1058 - val_loss: 0.0327 - val_mean_absolute_error: 0.1135
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0302 - mean_absolute_error: 0.0958 - val_loss: 0.0338 - val_mean_absolute_error: 0.1163
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0321 - mean_absolute_error: 0.1077 - val_loss: 0.0339 - val_mean_absolute_error: 0.1242
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0331 - mean_absolute_error: 0.1064 - val_loss: 0.0377 - val_mean_absolute_error: 0.1416

FOLD 5
Epoch 1/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.0309 - mean_absolute_error: 0.1052 - val_loss: 0.0198 - val_mean_absolute_error: 0.0656
Epoch 2/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0322 - mean_absolute_error: 0.1069 - val_loss: 0.0253 - val_mean_absolute_error: 0.0894
Epoch 3/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0370 - mean_absolute_error: 0.1180 - val_loss: 0.0180 - val_mean_absolute_error: 0.0634
Epoch 4/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0330 - mean_absolute_error: 0.1130 - val_loss: 0.0198 - val_mean_absolute_error: 0.0783
Epoch 5/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0285 - mean_absolute_error: 0.0973 - val_loss: 0.0152 - val_mean_absolute_error: 0.0463
Epoch 6/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0255 - mean_absolute_error: 0.0908 - val_loss: 0.0172 - val_mean_absolute_error: 0.0557
Epoch 7/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0532 - mean_absolute_error: 0.1274 - val_loss: 0.0147 - val_mean_absolute_error: 0.0437
Epoch 8/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0248 - mean_absolute_error: 0.0913 - val_loss: 0.0217 - val_mean_absolute_error: 0.0873
Epoch 9/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0334 - mean_absolute_error: 0.1083 - val_loss: 0.0165 - val_mean_absolute_error: 0.0538
Epoch 10/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0268 - mean_absolute_error: 0.0908 - val_loss: 0.0169 - val_mean_absolute_error: 0.0600
Epoch 11/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0397 - mean_absolute_error: 0.1370 - val_loss: 0.0193 - val_mean_absolute_error: 0.0672
Epoch 12/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0305 - mean_absolute_error: 0.1002 - val_loss: 0.0251 - val_mean_absolute_error: 0.0840
Epoch 13/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0340 - mean_absolute_error: 0.1022 - val_loss: 0.0207 - val_mean_absolute_error: 0.0656
Epoch 14/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0249 - mean_absolute_error: 0.0828 - val_loss: 0.0196 - val_mean_absolute_error: 0.0735
Epoch 15/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0415 - mean_absolute_error: 0.1305 - val_loss: 0.0166 - val_mean_absolute_error: 0.0568
Epoch 16/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0256 - mean_absolute_error: 0.0900 - val_loss: 0.0185 - val_mean_absolute_error: 0.0664
Epoch 17/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0325 - mean_absolute_error: 0.1095 - val_loss: 0.0203 - val_mean_absolute_error: 0.0801
Epoch 18/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0258 - mean_absolute_error: 0.0927 - val_loss: 0.0226 - val_mean_absolute_error: 0.0810
Epoch 19/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0306 - mean_absolute_error: 0.0998 - val_loss: 0.0243 - val_mean_absolute_error: 0.0866
Epoch 20/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0284 - mean_absolute_error: 0.0968 - val_loss: 0.0209 - val_mean_absolute_error: 0.0768
Epoch 21/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0290 - mean_absolute_error: 0.0998 - val_loss: 0.0183 - val_mean_absolute_error: 0.0720
Epoch 22/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0287 - mean_absolute_error: 0.0936 - val_loss: 0.0147 - val_mean_absolute_error: 0.0462
Epoch 23/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0330 - mean_absolute_error: 0.1141 - val_loss: 0.0157 - val_mean_absolute_error: 0.0529
Epoch 24/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0291 - mean_absolute_error: 0.1001 - val_loss: 0.0202 - val_mean_absolute_error: 0.0805
Epoch 25/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0302 - mean_absolute_error: 0.1080 - val_loss: 0.0256 - val_mean_absolute_error: 0.1001
Epoch 26/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0260 - mean_absolute_error: 0.0941 - val_loss: 0.0216 - val_mean_absolute_error: 0.0843
Epoch 27/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0361 - mean_absolute_error: 0.1104 - val_loss: 0.0163 - val_mean_absolute_error: 0.0538
Epoch 28/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0244 - mean_absolute_error: 0.0814 - val_loss: 0.0154 - val_mean_absolute_error: 0.0455
Epoch 29/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0309 - mean_absolute_error: 0.1078 - val_loss: 0.0147 - val_mean_absolute_error: 0.0475
Epoch 30/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0239 - mean_absolute_error: 0.0807 - val_loss: 0.0201 - val_mean_absolute_error: 0.0752
Epoch 31/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0274 - mean_absolute_error: 0.0889 - val_loss: 0.0252 - val_mean_absolute_error: 0.0973
Epoch 32/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0305 - mean_absolute_error: 0.1032 - val_loss: 0.0206 - val_mean_absolute_error: 0.0822
Epoch 33/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0229 - mean_absolute_error: 0.0748 - val_loss: 0.0158 - val_mean_absolute_error: 0.0504
Epoch 34/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0260 - mean_absolute_error: 0.0941 - val_loss: 0.0160 - val_mean_absolute_error: 0.0517
Epoch 35/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0364 - mean_absolute_error: 0.1127 - val_loss: 0.0189 - val_mean_absolute_error: 0.0733
Epoch 36/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0252 - mean_absolute_error: 0.0902 - val_loss: 0.0240 - val_mean_absolute_error: 0.0967
Epoch 37/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0320 - mean_absolute_error: 0.1107 - val_loss: 0.0174 - val_mean_absolute_error: 0.0643
Epoch 38/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0313 - mean_absolute_error: 0.0959 - val_loss: 0.0175 - val_mean_absolute_error: 0.0578
Epoch 39/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0273 - mean_absolute_error: 0.0925 - val_loss: 0.0177 - val_mean_absolute_error: 0.0655
Epoch 40/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0344 - mean_absolute_error: 0.0979 - val_loss: 0.0229 - val_mean_absolute_error: 0.0898
Epoch 41/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0310 - mean_absolute_error: 0.1002 - val_loss: 0.0245 - val_mean_absolute_error: 0.0942
Epoch 42/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0238 - mean_absolute_error: 0.0803 - val_loss: 0.0213 - val_mean_absolute_error: 0.0830
Epoch 43/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0363 - mean_absolute_error: 0.1101 - val_loss: 0.0234 - val_mean_absolute_error: 0.0847
Epoch 44/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0250 - mean_absolute_error: 0.0837 - val_loss: 0.0348 - val_mean_absolute_error: 0.1322
Epoch 45/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 15ms/step - loss: 0.0355 - mean_absolute_error: 0.1159 - val_loss: 0.0204 - val_mean_absolute_error: 0.0749
Epoch 46/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0309 - mean_absolute_error: 0.1088 - val_loss: 0.0162 - val_mean_absolute_error: 0.0607
Epoch 47/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0234 - mean_absolute_error: 0.0810 - val_loss: 0.0179 - val_mean_absolute_error: 0.0658
Epoch 48/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0309 - mean_absolute_error: 0.0998 - val_loss: 0.0188 - val_mean_absolute_error: 0.0723
Epoch 49/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - loss: 0.0233 - mean_absolute_error: 0.0840 - val_loss: 0.0208 - val_mean_absolute_error: 0.0816
Epoch 50/50
3/3 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - loss: 0.0216 - mean_absolute_error: 0.0750 - val_loss: 0.0211 - val_mean_absolute_error: 0.0804
Validation losses: [0.3328082263469696, 0.05425812676548958, 0.03594767302274704, 0.037672773003578186, 0.021075138822197914]
HPS: {'player_emb_dim': 32, 'dense_units': 48, 'dense_units_2': 16, 'learning_rate': 0.01, 'dropout_rate': 0.1, 'dropout_rate_2': 0.1, 'dropout_rate_inter': 0.2, 'interaction_scale': 2} Avg. across folds score(MSE): 0.08650029897689819
HPS: {'player_emb_dim': 32, 'dense_units': 64, 'dense_units_2': 80, 'learning_rate': 0.001, 'dropout_rate': 0.1, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.1, 'interaction_scale': 4} Avg. across folds score(MSE): 0.09333872832357884
HPS: {'player_emb_dim': 32, 'dense_units': 48, 'dense_units_2': 48, 'learning_rate': 0.01, 'dropout_rate': 0.1, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 3} Avg. across folds score(MSE): 0.08406737744808197
HPS: {'player_emb_dim': 32, 'dense_units': 64, 'dense_units_2': 128, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 2} Avg. across folds score(MSE): 0.24876852333545685
HPS: {'player_emb_dim': 32, 'dense_units': 48, 'dense_units_2': 112, 'learning_rate': 0.001, 'dropout_rate': 0.1, 'dropout_rate_2': 0.1, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 3} Avg. across folds score(MSE): 0.09635238759219647
HPS: {'player_emb_dim': 32, 'dense_units': 64, 'dense_units_2': 128, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 2}. Avg MSE: 0.24876852333545685.
Epoch 1/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 2s 5ms/step - loss: 1.3249 - mean_absolute_error: 0.9078
Epoch 2/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 1.1866 - mean_absolute_error: 0.8829 
Epoch 3/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.7558 - mean_absolute_error: 0.6648 
Epoch 4/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.6184 - mean_absolute_error: 0.5973 
Epoch 5/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.4518 - mean_absolute_error: 0.5263 
Epoch 6/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3500 - mean_absolute_error: 0.4718 
Epoch 7/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3245 - mean_absolute_error: 0.4330 
Epoch 8/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2606 - mean_absolute_error: 0.4011 
Epoch 9/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2061 - mean_absolute_error: 0.3314 
Epoch 10/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2488 - mean_absolute_error: 0.3807 
Epoch 11/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2875 - mean_absolute_error: 0.3992 
Epoch 12/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2512 - mean_absolute_error: 0.3892 
Epoch 13/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2691 - mean_absolute_error: 0.3816 
Epoch 14/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2219 - mean_absolute_error: 0.3398 
Epoch 15/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3662 - mean_absolute_error: 0.4048 
Epoch 16/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.3936 - mean_absolute_error: 0.4528 
Epoch 17/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2225 - mean_absolute_error: 0.3511 
Epoch 18/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2520 - mean_absolute_error: 0.3768 
Epoch 19/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1703 - mean_absolute_error: 0.2940 
Epoch 20/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1818 - mean_absolute_error: 0.3025 
Epoch 21/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.1794 - mean_absolute_error: 0.3103 
Epoch 22/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - loss: 0.2091 - mean_absolute_error: 0.3363 
Epoch 23/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2204 - mean_absolute_error: 0.3410 
Epoch 24/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2984 - mean_absolute_error: 0.3817 
Epoch 25/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.2007 - mean_absolute_error: 0.3433 
Epoch 26/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1537 - mean_absolute_error: 0.2761 
Epoch 27/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1651 - mean_absolute_error: 0.2801 
Epoch 28/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1131 - mean_absolute_error: 0.2246 
Epoch 29/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1873 - mean_absolute_error: 0.2769 
Epoch 30/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1527 - mean_absolute_error: 0.2594 
Epoch 31/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1193 - mean_absolute_error: 0.2507 
Epoch 32/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1306 - mean_absolute_error: 0.2577 
Epoch 33/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1166 - mean_absolute_error: 0.2341 
Epoch 34/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.0871 - mean_absolute_error: 0.2104 
Epoch 35/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1324 - mean_absolute_error: 0.2303 
Epoch 36/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1619 - mean_absolute_error: 0.2889 
Epoch 37/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1339 - mean_absolute_error: 0.2694 
Epoch 38/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1157 - mean_absolute_error: 0.2347 
Epoch 39/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1266 - mean_absolute_error: 0.2530 
Epoch 40/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1215 - mean_absolute_error: 0.2313 
Epoch 41/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1480 - mean_absolute_error: 0.2334 
Epoch 42/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1417 - mean_absolute_error: 0.2714 
Epoch 43/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1299 - mean_absolute_error: 0.2615 
Epoch 44/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1255 - mean_absolute_error: 0.2404 
Epoch 45/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - loss: 0.1492 - mean_absolute_error: 0.2718 
Epoch 46/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.1300 - mean_absolute_error: 0.2594 
Epoch 47/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - loss: 0.1427 - mean_absolute_error: 0.2590 
Epoch 48/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.1303 - mean_absolute_error: 0.2322 
Epoch 49/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.1444 - mean_absolute_error: 0.2473 
Epoch 50/50
4/4 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - loss: 0.1149 - mean_absolute_error: 0.2334 

Visualize results#

print(player_strengths.shape)
fig, _, _ = analyze_players_embeddings(model_inter, player_strengths, random_state=None)
fig
(31,)

player_strengths.shape: (30,)
embeddings_nd[:, 0].shape : (30,)
Embeddings shape: (30, 32)
Dimension 1 correlation with base strengths: r = 0.0717, p-value = 0.7065
Dimension 2 correlation with base strengths: r = -0.2393, p-value = 0.2027
Dimension 3 correlation with base strengths: r = 0.1776, p-value = 0.3477
Dimension 4 correlation with base strengths: r = 0.3192, p-value = 0.08559
Dimension 5 correlation with base strengths: r = 0.0321, p-value = 0.8662
Dimension 6 correlation with base strengths: r = 0.0184, p-value = 0.9229
Dimension 7 correlation with base strengths: r = 0.4702, p-value = 0.008745
Dimension 8 correlation with base strengths: r = -0.5620, p-value = 0.00123
Dimension 9 correlation with base strengths: r = 0.4430, p-value = 0.01422
Dimension 10 correlation with base strengths: r = -0.4918, p-value = 0.005775
Dimension 11 correlation with base strengths: r = -0.4155, p-value = 0.02241
Dimension 12 correlation with base strengths: r = 0.5909, p-value = 0.0005855
Dimension 13 correlation with base strengths: r = 0.2913, p-value = 0.1184
Dimension 14 correlation with base strengths: r = 0.5709, p-value = 0.0009851
Dimension 15 correlation with base strengths: r = 0.4633, p-value = 0.009919
Dimension 16 correlation with base strengths: r = -0.0003, p-value = 0.9988
Dimension 17 correlation with base strengths: r = 0.0428, p-value = 0.8225
Dimension 18 correlation with base strengths: r = 0.0463, p-value = 0.8081
Dimension 19 correlation with base strengths: r = 0.0239, p-value = 0.9003
Dimension 20 correlation with base strengths: r = 0.4650, p-value = 0.009614
Dimension 21 correlation with base strengths: r = -0.6802, p-value = 3.551e-05
Dimension 22 correlation with base strengths: r = -0.6626, p-value = 6.629e-05
Dimension 23 correlation with base strengths: r = -0.6200, p-value = 0.0002577
Dimension 24 correlation with base strengths: r = -0.0680, p-value = 0.721
Dimension 25 correlation with base strengths: r = -0.2386, p-value = 0.2042
Dimension 26 correlation with base strengths: r = 0.0802, p-value = 0.6736
Dimension 27 correlation with base strengths: r = -0.1057, p-value = 0.5784
Dimension 28 correlation with base strengths: r = 0.2461, p-value = 0.1899
Dimension 29 correlation with base strengths: r = -0.5702, p-value = 0.001002
Dimension 30 correlation with base strengths: r = 0.3270, p-value = 0.07781
Dimension 31 correlation with base strengths: r = 0.2143, p-value = 0.2555
Dimension 32 correlation with base strengths: r = 0.7693, p-value = 6.79e-07
Average absolute correlation across 32 components: 0.3224
player_strengths.shape: (30,)
embeddings_nd[:, 0].shape : (30,)
Embeddings shape: (30, 3)
Dimension 1 correlation with base strengths: r = 0.2981, p-value = 0.1096
Dimension 2 correlation with base strengths: r = 0.7076, p-value = 1.225e-05
Dimension 3 correlation with base strengths: r = 0.8901, p-value = 4.622e-11
Average absolute correlation across 3 components: 0.6319
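The per-dimension diagnostics above can be reproduced with a small helper. Below is a minimal sketch, not the actual code inside `analyze_players_embeddings`; it assumes `embeddings_nd` is an `(n_players, dim)` NumPy matrix and `player_strengths` the known base strengths, and reports Pearson r via `np.corrcoef` (the real helper also reports p-values, e.g. from `scipy.stats.pearsonr`):

```python
import numpy as np

def correlate_dimensions(embeddings_nd, player_strengths):
    """Pearson r of each embedding dimension against base strengths (p-values omitted)."""
    corrs = []
    for d in range(embeddings_nd.shape[1]):
        r = np.corrcoef(embeddings_nd[:, d], player_strengths)[0, 1]
        print(f"Dimension {d + 1} correlation with base strengths: r = {r:.4f}")
        corrs.append(r)
    avg_abs = float(np.mean(np.abs(corrs)))
    print(f"Average absolute correlation across {len(corrs)} components: {avg_abs:.4f}")
    return avg_abs

# Toy check: one dimension tracks strength perfectly, the other is pure noise
rng = np.random.default_rng(0)
strengths = rng.uniform(4, 10, size=30)
emb = np.column_stack([strengths, rng.normal(size=30)])
avg = correlate_dimensions(emb, strengths)
```

As in the 3-dimensional run above, a high average absolute correlation indicates that the embedding space as a whole encodes player strength, even when no single dimension does so alone.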

Applying to real players’ data#

Unfortunately, the real players’ data is private at the moment. To respect privacy, all names have been hashed.

Real data poses two challenges: we have only 19 games/competitions on record, and the intrinsic complexity of the data is much higher than that of the synthetic data (many more factors affect the outcomes, and we do not model them).
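`Competition` comes from the author’s private `competition_manager` module imported earlier. For readers without access, a minimal stand-in with the fields used below could look like this (a sketch of the interface, not the real class):

```python
from dataclasses import dataclass, field
import datetime

@dataclass
class Competition:
    """One recorded game: player IDs per team, goal difference, date, per-player goals."""
    team_a: list[int]
    team_b: list[int]
    score_diff: int                # team_a goals minus team_b goals
    date: datetime.date
    scores_stats: dict[int, int] = field(default_factory=dict)  # player ID -> goals

game = Competition(team_a=[17, 1, 4, 22, 13], team_b=[19, 2, 16, 5, 14],
                   score_diff=11, date=datetime.date(2025, 7, 13))
```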

team_members = ['9cdf31895733', '868ad28f0717', '9cbc7f61110f', '6475bd2141ea', 'f8640c61bdb8', '0a054ebd0bd4', 'e8f5e9fcf348', '005c4c838d4e', '69c33174130c', '322e6dad017e', '513a15ed036e', 'd52a38e07759', 'ebec404b6897', 'ac3056c6e834', 'a5099efb954f', '520ffe8f86cc', 'a1b3219ce542', '3fc696b71b14', '2e8beb1d6dd7', '777efbae29f7', 'a257b7d2b1ac', 'b481f10e4f61', '9cc5279d7053', '6ceb5dd4a971', 'bf96b228a637', '2d612596a528', '4a8f4fe29a55', '475ab8abd943', '6e8107a6c490', '9439a7b23bfb', '58c5093f8675']
team_members_with_ids = {idx + 1: tm for idx, tm in enumerate(team_members)}  # IDs start at 1; 0 is reserved for padding
team_members_with_ids
{1: '9cdf31895733',
 2: '868ad28f0717',
 3: '9cbc7f61110f',
 4: '6475bd2141ea',
 5: 'f8640c61bdb8',
 6: '0a054ebd0bd4',
 7: 'e8f5e9fcf348',
 8: '005c4c838d4e',
 9: '69c33174130c',
 10: '322e6dad017e',
 11: '513a15ed036e',
 12: 'd52a38e07759',
 13: 'ebec404b6897',
 14: 'ac3056c6e834',
 15: 'a5099efb954f',
 16: '520ffe8f86cc',
 17: 'a1b3219ce542',
 18: '3fc696b71b14',
 19: '2e8beb1d6dd7',
 20: '777efbae29f7',
 21: 'a257b7d2b1ac',
 22: 'b481f10e4f61',
 23: '9cc5279d7053',
 24: '6ceb5dd4a971',
 25: 'bf96b228a637',
 26: '2d612596a528',
 27: '4a8f4fe29a55',
 28: '475ab8abd943',
 29: '6e8107a6c490',
 30: '9439a7b23bfb',
 31: '58c5093f8675'}
# Index 0 is a placeholder for the masked/padding player (ID 0)
player_strengths_estimates = np.array([0, 9.5, 9, 5, 6, 8.5, 6.5, 7.5, 6.5, 4, 7, 6, 6.5, 9.25, 4.5, 6.75, 9.25, 8.5,
                                       7.5, 9.5, 5, 5.5, 5.5, 9.5, 8.75, 6.5, 8, 7, 7.5, 8, 7, 6.5])
import datetime

historical_competitions = [
 Competition(team_a=[17, 1, 4, 22, 13], team_b=[19, 2, 16, 5, 14], score_diff=11, date=datetime.date(2025, 7, 13), scores_stats={5: 6, 2: 6, 16: 4, 19: 2, 17: 7, 4: 4, 13: 12, 1: 5}),
 Competition(team_a=[12, 16, 2, 14, 6, 4], team_b=[27, 17, 3, 30, 18, 1], score_diff=1, date=datetime.date(2025, 7, 6), scores_stats={}),
 Competition(team_a=[6, 12, 17, 2, 19, 30], team_b=[16, 3, 8, 1, 10, 9], score_diff=2, date=datetime.date(2025, 6, 22), scores_stats={2: 8, 6: 2, 19: 3, 12: 3, 17: 4, 3: 1, 1: 2, 10: 5, 9: 4, 8: 3, 16: 2}),
 Competition(team_a=[13, 18, 3, 16, 21, 4, 15, 1], team_b=[2, 17, 24, 19, 27, 31, 12, 8], score_diff=3, date=datetime.date(2025, 6, 14), scores_stats={}),
 Competition(team_a=[2, 17, 12, 8, 21], team_b=[1, 16, 9, 27, 15], score_diff=3, date=datetime.date(2025, 6, 9), scores_stats={}),
 Competition(team_a=[13, 20, 17, 3, 5, 1, 12, 24], team_b=[16, 31, 6, 18, 2, 21, 23, 4], score_diff=3, date=datetime.date(2025, 6, 8), scores_stats={5: 6, 3: 1, 17: 2, 20: 1, 1: 1, 13: 1, 24: 1, 6: 1, 2: 4, 23: 1, 18: 1, 31: 1}),
 Competition(team_a=[3, 15, 12, 1, 17, 8], team_b=[16, 4, 6, 20, 27, 2], score_diff=1, date=datetime.date(2025, 5, 25), scores_stats={8: 2, 12: 2, 17: 6, 1: 4, 3: 1, 16: 3, 6: 2, 4: 4, 2: 4}),
 Competition(team_a=[16, 5, 1, 17, 15, 10], team_b=[23, 28, 2, 18, 12, 19], score_diff=1, date=datetime.date(2025, 5, 4), scores_stats={16: 4, 17: 4, 10: 2, 5: 1, 1: 2, 15: 1, 2: 4, 12: 1, 23: 4, 18: 4}),
 Competition(team_a=[17, 27, 3, 16, 4, 28, 2], team_b=[20, 18, 13, 21, 6, 12, 1], score_diff=-2, date=datetime.date(2025, 4, 12), scores_stats={2: 7, 3: 1, 16: 1, 6: 3, 1: 3, 20: 3, 13: 1, 18: 1}),
 Competition(team_a=[3, 21, 1, 17, 6], team_b=[16, 15, 2, 12, 14], score_diff=-3, date=datetime.date(2025, 4, 6), scores_stats={17: 4, 1: 4, 21: 2, 6: 2, 3: 1, 16: 4, 2: 6, 15: 3, 14: 1, 12: 2}),
 Competition(team_a=[2, 5, 21, 23, 17], team_b=[13, 12, 22, 1, 6], score_diff=-4, date=datetime.date(2025, 3, 29), scores_stats={}),
 Competition(team_a=[17, 24, 12, 11, 3, 8, 12, 14], team_b=[4, 1, 2, 5, 23, 18, 28, 9], score_diff=-4, date=datetime.date(2025, 3, 22), scores_stats={}),
 Competition(team_a=[12, 1, 2, 13, 23, 14, 18], team_b=[20, 5, 19, 16, 3, 4, 17], score_diff=0, date=datetime.date(2025, 3, 9), scores_stats={}),
 Competition(team_a=[3, 16, 2, 4, 17, 19, 12, 23, 10], team_b=[13, 7, 1, 5, 18, 11, 29, 6, 22], score_diff=-1, date=datetime.date(2025, 3, 2), scores_stats={}),
 Competition(team_a=[16, 9, 1, 18, 3, 2, 12, 20], team_b=[13, 5, 17, 14, 7, 6, 8, 23], score_diff=1, date=datetime.date(2025, 2, 8), scores_stats={}),
 Competition(team_a=[12, 2, 13, 1, 4, 9, 8, 6], team_b=[14, 16, 7, 5, 18, 3, 17, 11], score_diff=0, date=datetime.date(2025, 2, 1), scores_stats={}),
 Competition(team_a=[17, 22, 20, 10, 5, 7], team_b=[16, 2, 28, 3, 18, 15], score_diff=0, date=datetime.date(2025, 1, 26), scores_stats={}),
 Competition(team_a=[20, 7, 12, 2, 1, 9, 10, 18], team_b=[17, 4, 13, 16, 15, 8, 14, 3], score_diff=0, date=datetime.date(2025, 1, 18), scores_stats={}),
 Competition(team_a=[15, 17, 13, 1, 3, 2, 4], team_b=[11, 12, 14, 1, 6, 16, 27], score_diff=0, date=datetime.date(2024, 12, 29), scores_stats={})
 ]

Transforming competitions into model input data#

NUM_PLAYERS = len(team_members_with_ids)
MIN_TEAM_SIZE = 5
MAX_TEAM_SIZE = 9
NUM_GAMES = len(historical_competitions)


teamA_data = np.zeros((NUM_GAMES, MAX_TEAM_SIZE), dtype=int)
teamB_data = np.zeros((NUM_GAMES, MAX_TEAM_SIZE), dtype=int)
outcomes = np.zeros(NUM_GAMES)

print(f"Shape of teamA_data: {teamA_data.shape}")

for comp_i, competition in enumerate(historical_competitions):
    teamA_players = competition.team_a
    teamB_players = competition.team_b
    assert len(teamA_players) == len(teamB_players), (
        f"Teams are supposed to be of equal size, Team A: {len(teamA_players)} Team B: {len(teamB_players)}"
    )
    team_size = len(teamA_players)

    outcomes[comp_i] = competition.score_diff

    # Pad teams to MAX_TEAM_SIZE with zeros (ID 0 is the masked/padding player)
    teamA_data[comp_i, :team_size] = teamA_players
    teamB_data[comp_i, :team_size] = teamB_players

print("teamA_data shape:", teamA_data.shape)
print("teamB_data shape:", teamB_data.shape)
print("outcomes shape:", outcomes.shape)
Shape of teamA_data: (19, 9)
teamA_data shape: (19, 9)
teamB_data shape: (19, 9)
outcomes shape: (19,)
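Zero-padding works because player ID 0 is never assigned to a real player, so it can serve as a mask. A small NumPy illustration of the idea (not the model code) shows how padded slots can be ignored when aggregating per-team quantities; the IDs and strengths here are made up:

```python
import numpy as np

# Two games padded to a max team size of 4; 0 marks an empty slot
team_ids = np.array([[3, 1, 2, 0],
                     [5, 4, 0, 0]])

# Hypothetical per-player strengths, indexed by player ID (index 0 unused)
strengths = np.array([0.0, 6.5, 7.0, 8.0, 5.5, 9.0])

mask = team_ids != 0                              # True where a real player sits
team_strength = (strengths[team_ids] * mask).sum(axis=1)
print(team_strength)                              # per-game sum, padding ignored
```

In the actual model, the same effect is achieved by the embedding layer masking ID 0, so variable-size teams share one fixed input shape.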

Using the same model and hyperparameters#

from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)

# Learning rate schedule: step decay, halving the rate every 10 epochs
def lr_schedule(epoch, lr):
    drop_rate = 0.5
    epochs_drop = 10
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

# Instantiate the scheduler callback
lr_scheduler = tf.keras.callbacks.LearningRateScheduler(lr_schedule)

# Adaptive alternative: reduce LR when val_loss plateaus
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=5, min_lr=1e-6)

es_callbacks = [lr_scheduler, reduce_lr, early_stop]
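To sanity-check the step decay, the schedule can be simulated outside Keras. This sketch reproduces `lr_schedule` from above and mimics how `LearningRateScheduler` feeds the current rate back in each epoch:

```python
def lr_schedule(epoch, lr):
    drop_rate = 0.5
    epochs_drop = 10
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

lr = 0.01
trace = []
for epoch in range(25):
    lr = lr_schedule(epoch, lr)   # LearningRateScheduler does this every epoch
    trace.append(lr)

print(trace[0], trace[10], trace[20])   # 0.01 0.005 0.0025
```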


all_best_hps_real = hyperparameter_search(build_model_cv_atten, callbacks=es_callbacks)
teamA_data shape: (19, 9)
teamB_data shape: (19, 9)
outcomes shape: (19,)

FOLD 1
/opt/anaconda3/envs/footballman/lib/python3.12/site-packages/keras/src/saving/saving_lib.py:757: UserWarning:

Skipping variable loading for optimizer 'adam', because it has 2 variables whereas the saved optimizer has 56 variables. 
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 245ms/step - loss: 29.2680 - mean_absolute_error: 4.1632
1: 29.268041610717773     2: 4.163173198699951  
29.268041610717773

FOLD 2
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 336ms/step - loss: 1.6449 - mean_absolute_error: 1.0707
1: 1.6449332237243652     2: 1.0707132816314697  
1.6449332237243652

FOLD 3
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 243ms/step - loss: 3.2652 - mean_absolute_error: 1.3600
1: 3.2651822566986084     2: 1.3600369691848755  
3.2651822566986084

FOLD 4
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 332ms/step - loss: 1.5566 - mean_absolute_error: 1.0142
1: 1.556578278541565     2: 1.0141900777816772  
1.556578278541565

FOLD 5
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 247ms/step - loss: 4.9411 - mean_absolute_error: 1.6880
1: 4.94112491607666     2: 1.687966227531433  
4.94112491607666

Best hyperparameters found:
player_emb_dim: 32
dense_units: 112
dense_units_2: 80
learning_rate: 0.01
dropout_rate: 0.1
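With per-fold MSEs like the ones printed above, a sensible way to compare hyperparameter candidates is to average the fold losses rather than trusting any single split. A minimal sketch in plain Python (the helper name `mean_fold_loss` is illustrative and not part of `competition_manager`; the fold values are copied from the log output above):

```python
def mean_fold_loss(fold_losses):
    """Average per-fold validation MSEs into a single comparison score."""
    if not fold_losses:
        raise ValueError("need at least one fold loss")
    return sum(fold_losses) / len(fold_losses)

# Per-fold MSEs reported in the log above (folds 2 through 5).
fold_mses = [1.6449, 3.2652, 1.5566, 4.9411]
print(f"mean CV loss: {mean_fold_loss(fold_mses):.4f}")
```

Averaging smooths out the large fold-to-fold variance visible above (from about 1.6 to about 4.9), which is expected with only around 20 recorded games per split.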

Selecting best hyperparameters#

model_real, model_real_train_loss = train_best_hps_model(all_best_hps_real)
HPS: {'player_emb_dim': 32, 'dense_units': 112, 'dense_units_2': 80, 'learning_rate': 0.01, 'dropout_rate': 0.1}. MSE during RandomSearch: 29.268041610717773. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 4.0605 - mean_absolute_error: 1.5273 - val_loss: 32.2726 - val_mean_absolute_error: 4.5183
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 13ms/step - loss: 2.4956 - mean_absolute_error: 1.1688
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.4956 - mean_absolute_error: 1.1688 - val_loss: 32.6210 - val_mean_absolute_error: 4.4605
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.7982 - mean_absolute_error: 0.6267 - val_loss: 40.0032 - val_mean_absolute_error: 4.8308
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.5882 - mean_absolute_error: 0.5940 - val_loss: 56.4604 - val_mean_absolute_error: 5.6138
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 1.7976 - mean_absolute_error: 1.1849 - val_loss: 40.1176 - val_mean_absolute_error: 4.9150
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 50ms/step - loss: 2.7080 - mean_absolute_error: 1.4918 - val_loss: 44.2076 - val_mean_absolute_error: 5.6352
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 50ms/step - loss: 2.8895 - mean_absolute_error: 1.3137 - val_loss: 41.6341 - val_mean_absolute_error: 5.2534
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.2777 - mean_absolute_error: 0.9525 - val_loss: 38.3720 - val_mean_absolute_error: 4.7217
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.9756 - mean_absolute_error: 0.7237 - val_loss: 39.4906 - val_mean_absolute_error: 4.7341
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.8931 - mean_absolute_error: 0.6737 - val_loss: 42.1505 - val_mean_absolute_error: 4.8995
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.1514 - mean_absolute_error: 0.7532 - val_loss: 44.6272 - val_mean_absolute_error: 5.0498
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.6320 - mean_absolute_error: 0.6160 - val_loss: 45.7994 - val_mean_absolute_error: 5.1367
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.9146 - mean_absolute_error: 0.7254 - val_loss: 43.3016 - val_mean_absolute_error: 4.9843
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.4471 - mean_absolute_error: 0.5188 - val_loss: 38.8978 - val_mean_absolute_error: 4.7905
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.3551 - mean_absolute_error: 0.4077 - val_loss: 36.3795 - val_mean_absolute_error: 4.6604
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.2317 - mean_absolute_error: 0.3404 - val_loss: 35.5184 - val_mean_absolute_error: 4.6115
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.1826 - mean_absolute_error: 0.3409 - val_loss: 35.7273 - val_mean_absolute_error: 4.6507
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.5576 - mean_absolute_error: 0.4685 - val_loss: 36.4858 - val_mean_absolute_error: 4.7138
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.1036 - mean_absolute_error: 0.2288 - val_loss: 36.6703 - val_mean_absolute_error: 4.7440
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.0951 - mean_absolute_error: 0.2030 - val_loss: 36.5643 - val_mean_absolute_error: 4.7451
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 0.2153 - mean_absolute_error: 0.3500 - val_loss: 37.4281 - val_mean_absolute_error: 4.7824
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.1482 - mean_absolute_error: 0.2794 - val_loss: 38.5091 - val_mean_absolute_error: 4.8254
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.1761 - mean_absolute_error: 0.3287 - val_loss: 39.0994 - val_mean_absolute_error: 4.8255
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.1059 - mean_absolute_error: 0.2369 - val_loss: 39.5803 - val_mean_absolute_error: 4.8373
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.4987 - mean_absolute_error: 0.4164 - val_loss: 38.9565 - val_mean_absolute_error: 4.8189
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.1916 - mean_absolute_error: 0.2874 - val_loss: 37.2784 - val_mean_absolute_error: 4.7271
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.0998 - mean_absolute_error: 0.2121 - val_loss: 36.6188 - val_mean_absolute_error: 4.6819
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.3288 - mean_absolute_error: 0.3948 - val_loss: 35.6650 - val_mean_absolute_error: 4.6072
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.1869 - mean_absolute_error: 0.3334 - val_loss: 35.1878 - val_mean_absolute_error: 4.5595
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.0827 - mean_absolute_error: 0.2317 - val_loss: 35.4851 - val_mean_absolute_error: 4.5784
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.1740 - mean_absolute_error: 0.2702 - val_loss: 36.1154 - val_mean_absolute_error: 4.6210
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.1807 - mean_absolute_error: 0.3215 - val_loss: 37.1930 - val_mean_absolute_error: 4.7186
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.1058 - mean_absolute_error: 0.2260 - val_loss: 37.8267 - val_mean_absolute_error: 4.7816
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.1997 - mean_absolute_error: 0.3410 - val_loss: 37.6566 - val_mean_absolute_error: 4.8046
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.1330 - mean_absolute_error: 0.2748 - val_loss: 37.6367 - val_mean_absolute_error: 4.8142
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2056 - mean_absolute_error: 0.2938 - val_loss: 37.4132 - val_mean_absolute_error: 4.7991
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.0864 - mean_absolute_error: 0.2073 - val_loss: 37.0554 - val_mean_absolute_error: 4.7660
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.1259 - mean_absolute_error: 0.2192 - val_loss: 37.3311 - val_mean_absolute_error: 4.7534
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.0754 - mean_absolute_error: 0.1744 - val_loss: 37.8120 - val_mean_absolute_error: 4.7510
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2908 - mean_absolute_error: 0.3353 - val_loss: 37.5003 - val_mean_absolute_error: 4.7560
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1230 - mean_absolute_error: 0.2486 - val_loss: 37.2095 - val_mean_absolute_error: 4.7700
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1532 - mean_absolute_error: 0.2701 - val_loss: 37.2938 - val_mean_absolute_error: 4.7913
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.2789 - mean_absolute_error: 0.3616 - val_loss: 37.4709 - val_mean_absolute_error: 4.8048
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.1609 - mean_absolute_error: 0.2174 - val_loss: 37.6976 - val_mean_absolute_error: 4.8105
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.0975 - mean_absolute_error: 0.1783 - val_loss: 38.2588 - val_mean_absolute_error: 4.8309
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.0805 - mean_absolute_error: 0.1590 - val_loss: 39.2333 - val_mean_absolute_error: 4.8617
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1691 - mean_absolute_error: 0.2894 - val_loss: 39.6527 - val_mean_absolute_error: 4.8740
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4043 - mean_absolute_error: 0.3661 - val_loss: 39.5280 - val_mean_absolute_error: 4.8768
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.0665 - mean_absolute_error: 0.1527 - val_loss: 39.2570 - val_mean_absolute_error: 4.8699
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1397 - mean_absolute_error: 0.2625 - val_loss: 38.5222 - val_mean_absolute_error: 4.8380
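In fold 1 the validation loss stays in the 35-40 range while the training loss collapses below 0.2, a classic sign of overfitting on a tiny dataset, and a case where training could stop long before epoch 50. The patience logic that a Keras `EarlyStopping` callback applies can be sketched in plain Python (a standalone illustrative helper, not part of this chapter's code):

```python
def best_stopping_epoch(val_losses, patience=5):
    """Return the 1-based epoch of the best val_loss, stopping the scan once
    no improvement has been seen for `patience` consecutive epochs."""
    best_epoch, best_loss, waited = 1, float("inf"), 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best_loss:
            best_loss, best_epoch, waited = loss, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                break  # patience exhausted: training would halt here
    return best_epoch

# Example: val_loss improves through epoch 3, then plateaus.
print(best_stopping_epoch([5.0, 4.0, 3.5, 3.6, 3.7, 3.8, 3.9], patience=3))
```

With `restore_best_weights=True`, Keras's own callback additionally rolls the model back to the weights from that best epoch rather than keeping the final ones.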

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 59ms/step - loss: 10.9266 - mean_absolute_error: 1.4910 - val_loss: 0.1211 - val_mean_absolute_error: 0.2754
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 8.5581 - mean_absolute_error: 1.4309 - val_loss: 0.3760 - val_mean_absolute_error: 0.4972
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 6.8149 - mean_absolute_error: 1.7091 - val_loss: 3.3583 - val_mean_absolute_error: 1.3701
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 5.6984 - mean_absolute_error: 1.8346 - val_loss: 5.2217 - val_mean_absolute_error: 1.7847
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.8338 - mean_absolute_error: 1.6235 - val_loss: 3.9344 - val_mean_absolute_error: 1.7001
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.6994 - mean_absolute_error: 1.0543 - val_loss: 4.3420 - val_mean_absolute_error: 1.7787
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.9698 - mean_absolute_error: 1.1904 - val_loss: 3.3518 - val_mean_absolute_error: 1.5895
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.5962 - mean_absolute_error: 0.9513 - val_loss: 6.6655 - val_mean_absolute_error: 2.0930
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.2409 - mean_absolute_error: 1.0330 - val_loss: 1.4484 - val_mean_absolute_error: 1.0456
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.9497 - mean_absolute_error: 0.8364 - val_loss: 0.2180 - val_mean_absolute_error: 0.3553
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.2448 - mean_absolute_error: 0.8058 - val_loss: 0.9978 - val_mean_absolute_error: 0.8910
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.3847 - mean_absolute_error: 0.8584 - val_loss: 1.3918 - val_mean_absolute_error: 1.0956
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.8516 - mean_absolute_error: 0.9250 - val_loss: 0.4332 - val_mean_absolute_error: 0.5426
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.7788 - mean_absolute_error: 0.6571 - val_loss: 0.6007 - val_mean_absolute_error: 0.6985
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.9087 - mean_absolute_error: 0.8620 - val_loss: 0.7213 - val_mean_absolute_error: 0.7516
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.0236 - mean_absolute_error: 0.7841 - val_loss: 0.3975 - val_mean_absolute_error: 0.5131
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2734 - mean_absolute_error: 0.3668 - val_loss: 0.2546 - val_mean_absolute_error: 0.4753
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.1793 - mean_absolute_error: 0.8305 - val_loss: 0.2712 - val_mean_absolute_error: 0.4319
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6707 - mean_absolute_error: 0.5672 - val_loss: 0.3280 - val_mean_absolute_error: 0.4720
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.6450 - mean_absolute_error: 0.5237 - val_loss: 0.5811 - val_mean_absolute_error: 0.6204
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5820 - mean_absolute_error: 0.6031 - val_loss: 0.7036 - val_mean_absolute_error: 0.7305
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3152 - mean_absolute_error: 0.4177 - val_loss: 0.5721 - val_mean_absolute_error: 0.6297
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.0259 - mean_absolute_error: 0.6812 - val_loss: 0.6544 - val_mean_absolute_error: 0.6535
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2589 - mean_absolute_error: 0.3280 - val_loss: 0.5829 - val_mean_absolute_error: 0.6572
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.3302 - mean_absolute_error: 0.4180 - val_loss: 0.6688 - val_mean_absolute_error: 0.6670
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.0204 - mean_absolute_error: 0.4684 - val_loss: 0.5577 - val_mean_absolute_error: 0.6550
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2192 - mean_absolute_error: 0.3029 - val_loss: 0.4298 - val_mean_absolute_error: 0.6261
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3813 - mean_absolute_error: 0.4845 - val_loss: 0.4387 - val_mean_absolute_error: 0.6221
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4743 - mean_absolute_error: 0.4543 - val_loss: 0.6194 - val_mean_absolute_error: 0.6388
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.7647 - mean_absolute_error: 0.5948 - val_loss: 0.8131 - val_mean_absolute_error: 0.6853
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.5122 - mean_absolute_error: 0.5350 - val_loss: 0.7259 - val_mean_absolute_error: 0.7141
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4959 - mean_absolute_error: 0.4664 - val_loss: 1.2312 - val_mean_absolute_error: 0.8384
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.5790 - mean_absolute_error: 0.5240 - val_loss: 1.3903 - val_mean_absolute_error: 0.8423
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4684 - mean_absolute_error: 0.4389 - val_loss: 1.0011 - val_mean_absolute_error: 0.7814
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.7626 - mean_absolute_error: 0.4622 - val_loss: 0.8197 - val_mean_absolute_error: 0.7929
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.7884 - mean_absolute_error: 0.5930 - val_loss: 0.8961 - val_mean_absolute_error: 0.8502
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2918 - mean_absolute_error: 0.3825 - val_loss: 1.0421 - val_mean_absolute_error: 0.9056
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.8365 - mean_absolute_error: 0.5598 - val_loss: 0.9205 - val_mean_absolute_error: 0.8723
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.4031 - mean_absolute_error: 0.6842 - val_loss: 0.6379 - val_mean_absolute_error: 0.7465
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.3013 - mean_absolute_error: 0.3632 - val_loss: 0.4677 - val_mean_absolute_error: 0.5973
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.2788 - mean_absolute_error: 0.4130 - val_loss: 0.4797 - val_mean_absolute_error: 0.5571
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 0.8992 - mean_absolute_error: 0.6047 - val_loss: 0.3382 - val_mean_absolute_error: 0.4689
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.3947 - mean_absolute_error: 0.3853 - val_loss: 0.1633 - val_mean_absolute_error: 0.3412
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.7274 - mean_absolute_error: 0.5294 - val_loss: 0.1589 - val_mean_absolute_error: 0.3437
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.2269 - mean_absolute_error: 0.3383 - val_loss: 0.2837 - val_mean_absolute_error: 0.3722
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.4036 - mean_absolute_error: 0.4877 - val_loss: 0.4820 - val_mean_absolute_error: 0.5368
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.3591 - mean_absolute_error: 0.6552 - val_loss: 0.9055 - val_mean_absolute_error: 0.7576
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3125 - mean_absolute_error: 0.3844 - val_loss: 1.1958 - val_mean_absolute_error: 0.8782
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.6572 - mean_absolute_error: 0.6230 - val_loss: 1.1193 - val_mean_absolute_error: 0.8372
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.6703 - mean_absolute_error: 0.4804 - val_loss: 0.9103 - val_mean_absolute_error: 0.7241

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - loss: 3.2342 - mean_absolute_error: 0.7871 - val_loss: 0.1134 - val_mean_absolute_error: 0.2595
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.5244 - mean_absolute_error: 0.5996 - val_loss: 0.1378 - val_mean_absolute_error: 0.3164
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.5491 - mean_absolute_error: 0.5367 - val_loss: 0.1681 - val_mean_absolute_error: 0.3688
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.3396 - mean_absolute_error: 0.7425 - val_loss: 0.1962 - val_mean_absolute_error: 0.3778
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.9471 - mean_absolute_error: 0.9504 - val_loss: 0.2255 - val_mean_absolute_error: 0.3879
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.6648 - mean_absolute_error: 0.9111 - val_loss: 0.1824 - val_mean_absolute_error: 0.3394
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.9935 - mean_absolute_error: 0.7035 - val_loss: 0.1192 - val_mean_absolute_error: 0.2377
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.6112 - mean_absolute_error: 0.4988 - val_loss: 0.0806 - val_mean_absolute_error: 0.1651
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 1.2931 - mean_absolute_error: 0.6952 - val_loss: 0.0862 - val_mean_absolute_error: 0.1587
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.3137 - mean_absolute_error: 0.4469 - val_loss: 0.0924 - val_mean_absolute_error: 0.1725
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.8670 - mean_absolute_error: 0.8305 - val_loss: 0.0838 - val_mean_absolute_error: 0.1949
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.9883 - mean_absolute_error: 0.6443 - val_loss: 0.0698 - val_mean_absolute_error: 0.1578
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.9498 - mean_absolute_error: 0.7310 - val_loss: 0.0406 - val_mean_absolute_error: 0.0923
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.5819 - mean_absolute_error: 0.5314 - val_loss: 0.0626 - val_mean_absolute_error: 0.1728
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.2192 - mean_absolute_error: 0.5905 - val_loss: 0.1471 - val_mean_absolute_error: 0.2996
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.1759 - mean_absolute_error: 0.3089 - val_loss: 0.2807 - val_mean_absolute_error: 0.4291
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.2723 - mean_absolute_error: 0.3893 - val_loss: 0.3863 - val_mean_absolute_error: 0.5061
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.2924 - mean_absolute_error: 0.3966 - val_loss: 0.3787 - val_mean_absolute_error: 0.4955
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.5742 - mean_absolute_error: 0.5176 - val_loss: 0.3072 - val_mean_absolute_error: 0.4410
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.7076 - mean_absolute_error: 0.5109 - val_loss: 0.2383 - val_mean_absolute_error: 0.4086
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4301 - mean_absolute_error: 0.4596 - val_loss: 0.1461 - val_mean_absolute_error: 0.3190
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.6753 - mean_absolute_error: 0.5120 - val_loss: 0.0890 - val_mean_absolute_error: 0.2335
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.2536 - mean_absolute_error: 0.3892 - val_loss: 0.1217 - val_mean_absolute_error: 0.2518
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.2716 - mean_absolute_error: 0.3635 - val_loss: 0.2532 - val_mean_absolute_error: 0.3387
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.2431 - mean_absolute_error: 0.6360 - val_loss: 0.4296 - val_mean_absolute_error: 0.4179
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 50ms/step - loss: 2.7495 - mean_absolute_error: 0.7970 - val_loss: 0.4354 - val_mean_absolute_error: 0.3745
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.4247 - mean_absolute_error: 0.4524 - val_loss: 0.3438 - val_mean_absolute_error: 0.3415
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.4185 - mean_absolute_error: 0.4712 - val_loss: 0.3413 - val_mean_absolute_error: 0.3396
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.3008 - mean_absolute_error: 0.6005 - val_loss: 0.3588 - val_mean_absolute_error: 0.3654
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.0953 - mean_absolute_error: 0.2110 - val_loss: 0.3836 - val_mean_absolute_error: 0.4471
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.8690 - mean_absolute_error: 0.5788 - val_loss: 0.6722 - val_mean_absolute_error: 0.6178
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4154 - mean_absolute_error: 0.4185 - val_loss: 1.2091 - val_mean_absolute_error: 0.8514
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6444 - mean_absolute_error: 0.5640 - val_loss: 1.4659 - val_mean_absolute_error: 0.9517
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.2787 - mean_absolute_error: 0.8330 - val_loss: 1.2184 - val_mean_absolute_error: 0.8398
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4930 - mean_absolute_error: 0.4612 - val_loss: 0.9148 - val_mean_absolute_error: 0.7111
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4046 - mean_absolute_error: 0.4643 - val_loss: 0.6036 - val_mean_absolute_error: 0.5687
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.1583 - mean_absolute_error: 0.3170 - val_loss: 0.3696 - val_mean_absolute_error: 0.4364
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3444 - mean_absolute_error: 0.3478 - val_loss: 0.2058 - val_mean_absolute_error: 0.3110
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.0321 - mean_absolute_error: 0.6298 - val_loss: 0.1468 - val_mean_absolute_error: 0.2500
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.6594 - mean_absolute_error: 0.7964 - val_loss: 0.3041 - val_mean_absolute_error: 0.3308
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3365 - mean_absolute_error: 0.4021 - val_loss: 0.5916 - val_mean_absolute_error: 0.4393
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3411 - mean_absolute_error: 0.4313 - val_loss: 0.8894 - val_mean_absolute_error: 0.5338
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4592 - mean_absolute_error: 0.4921 - val_loss: 1.0760 - val_mean_absolute_error: 0.6093
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.3434 - mean_absolute_error: 0.6464 - val_loss: 1.1409 - val_mean_absolute_error: 0.6705
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.6541 - mean_absolute_error: 0.5800 - val_loss: 1.1000 - val_mean_absolute_error: 0.7115
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.1870 - mean_absolute_error: 0.3100 - val_loss: 0.8821 - val_mean_absolute_error: 0.6466
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 107ms/step - loss: 0.3960 - mean_absolute_error: 0.4366 - val_loss: 0.6747 - val_mean_absolute_error: 0.5565
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.3655 - mean_absolute_error: 0.4140 - val_loss: 0.4685 - val_mean_absolute_error: 0.4749
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1321 - mean_absolute_error: 0.2671 - val_loss: 0.3677 - val_mean_absolute_error: 0.4468
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3189 - mean_absolute_error: 0.4114 - val_loss: 0.2253 - val_mean_absolute_error: 0.3739

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 62ms/step - loss: 0.9087 - mean_absolute_error: 0.5727 - val_loss: 0.0440 - val_mean_absolute_error: 0.1128
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5745 - mean_absolute_error: 0.5149 - val_loss: 0.0367 - val_mean_absolute_error: 0.0975
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.5771 - mean_absolute_error: 0.5422 - val_loss: 0.0353 - val_mean_absolute_error: 0.0840
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3175 - mean_absolute_error: 0.4056 - val_loss: 0.0464 - val_mean_absolute_error: 0.1309
Epoch 5/50
2025-08-09 14:13:07.741758: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.2329 - mean_absolute_error: 0.3581 - val_loss: 0.0752 - val_mean_absolute_error: 0.1765
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.4683 - mean_absolute_error: 0.4487 - val_loss: 0.1204 - val_mean_absolute_error: 0.2053
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3085 - mean_absolute_error: 0.4049 - val_loss: 0.1869 - val_mean_absolute_error: 0.2813
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 64ms/step - loss: 0.4682 - mean_absolute_error: 0.4671 - val_loss: 0.2792 - val_mean_absolute_error: 0.3444
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3454 - mean_absolute_error: 0.3963 - val_loss: 0.3145 - val_mean_absolute_error: 0.3859
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2944 - mean_absolute_error: 0.3921 - val_loss: 0.2638 - val_mean_absolute_error: 0.3820
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4324 - mean_absolute_error: 0.4868 - val_loss: 0.1873 - val_mean_absolute_error: 0.3093
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.6249 - mean_absolute_error: 0.6430 - val_loss: 0.1247 - val_mean_absolute_error: 0.2071
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.3860 - mean_absolute_error: 0.4326 - val_loss: 0.0795 - val_mean_absolute_error: 0.1707
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4786 - mean_absolute_error: 0.4602 - val_loss: 0.0513 - val_mean_absolute_error: 0.1489
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.7221 - mean_absolute_error: 0.4918 - val_loss: 0.0539 - val_mean_absolute_error: 0.1425
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.4027 - mean_absolute_error: 0.4373 - val_loss: 0.0632 - val_mean_absolute_error: 0.1414
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.6251 - mean_absolute_error: 0.4036 - val_loss: 0.0658 - val_mean_absolute_error: 0.1491
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.6030 - mean_absolute_error: 0.4785 - val_loss: 0.0574 - val_mean_absolute_error: 0.1472
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.0156 - mean_absolute_error: 0.5693 - val_loss: 0.0591 - val_mean_absolute_error: 0.1704
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.9165 - mean_absolute_error: 0.4495 - val_loss: 0.0814 - val_mean_absolute_error: 0.1890
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.2025 - mean_absolute_error: 0.2932 - val_loss: 0.1017 - val_mean_absolute_error: 0.1908
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.1540 - mean_absolute_error: 0.5942 - val_loss: 0.0665 - val_mean_absolute_error: 0.1427
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.7313 - mean_absolute_error: 0.8333 - val_loss: 0.0370 - val_mean_absolute_error: 0.0895
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4075 - mean_absolute_error: 0.3432 - val_loss: 0.0405 - val_mean_absolute_error: 0.1083
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3071 - mean_absolute_error: 0.3495 - val_loss: 0.0570 - val_mean_absolute_error: 0.1509
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.7192 - mean_absolute_error: 0.4595 - val_loss: 0.0589 - val_mean_absolute_error: 0.1492
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5716 - mean_absolute_error: 0.4866 - val_loss: 0.0448 - val_mean_absolute_error: 0.1217
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.9980 - mean_absolute_error: 0.5036 - val_loss: 0.0330 - val_mean_absolute_error: 0.0600
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.4737 - mean_absolute_error: 0.5151 - val_loss: 0.0595 - val_mean_absolute_error: 0.1623
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4692 - mean_absolute_error: 0.3618 - val_loss: 0.1567 - val_mean_absolute_error: 0.2881
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3713 - mean_absolute_error: 0.4330 - val_loss: 0.2717 - val_mean_absolute_error: 0.3715
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.1788 - mean_absolute_error: 0.7618 - val_loss: 0.3140 - val_mean_absolute_error: 0.4017
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 83ms/step - loss: 0.3797 - mean_absolute_error: 0.4347 - val_loss: 0.2055 - val_mean_absolute_error: 0.3284
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 53ms/step - loss: 0.2745 - mean_absolute_error: 0.3972 - val_loss: 0.1372 - val_mean_absolute_error: 0.2878
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 49ms/step - loss: 0.7183 - mean_absolute_error: 0.6358 - val_loss: 0.0635 - val_mean_absolute_error: 0.1563
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.9970 - mean_absolute_error: 0.5445 - val_loss: 0.0650 - val_mean_absolute_error: 0.1669
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 54ms/step - loss: 0.2030 - mean_absolute_error: 0.3125 - val_loss: 0.1287 - val_mean_absolute_error: 0.2276
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 51ms/step - loss: 0.3763 - mean_absolute_error: 0.4361 - val_loss: 0.1955 - val_mean_absolute_error: 0.2707
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 52ms/step - loss: 0.2989 - mean_absolute_error: 0.3974 - val_loss: 0.2334 - val_mean_absolute_error: 0.2937
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 55ms/step - loss: 0.6055 - mean_absolute_error: 0.5483 - val_loss: 0.2440 - val_mean_absolute_error: 0.3098
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 71ms/step - loss: 0.4112 - mean_absolute_error: 0.4256 - val_loss: 0.2477 - val_mean_absolute_error: 0.3255
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.4106 - mean_absolute_error: 0.4374 - val_loss: 0.2579 - val_mean_absolute_error: 0.3384
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.5417 - mean_absolute_error: 0.5151 - val_loss: 0.2451 - val_mean_absolute_error: 0.3716
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.2380 - mean_absolute_error: 0.3858 - val_loss: 0.2295 - val_mean_absolute_error: 0.3808
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 0.3341 - mean_absolute_error: 0.3905 - val_loss: 0.2213 - val_mean_absolute_error: 0.3719
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 49ms/step - loss: 0.4901 - mean_absolute_error: 0.5152 - val_loss: 0.2157 - val_mean_absolute_error: 0.3545
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 0.1841 - mean_absolute_error: 0.3040 - val_loss: 0.2148 - val_mean_absolute_error: 0.3497
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 56ms/step - loss: 0.7016 - mean_absolute_error: 0.3919 - val_loss: 0.2062 - val_mean_absolute_error: 0.3424
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 58ms/step - loss: 0.2812 - mean_absolute_error: 0.3230 - val_loss: 0.1882 - val_mean_absolute_error: 0.3205
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 52ms/step - loss: 0.3983 - mean_absolute_error: 0.3798 - val_loss: 0.1832 - val_mean_absolute_error: 0.3033

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 88ms/step - loss: 0.4895 - mean_absolute_error: 0.4138 - val_loss: 0.0936 - val_mean_absolute_error: 0.1998
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 74ms/step - loss: 0.1870 - mean_absolute_error: 0.3186 - val_loss: 0.1299 - val_mean_absolute_error: 0.2401
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.3020 - mean_absolute_error: 0.3371 - val_loss: 0.2129 - val_mean_absolute_error: 0.3001
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.3645 - mean_absolute_error: 0.3720 - val_loss: 0.2186 - val_mean_absolute_error: 0.3012
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.1550 - mean_absolute_error: 0.2588 - val_loss: 0.1710 - val_mean_absolute_error: 0.2640
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.1369 - mean_absolute_error: 0.2638 - val_loss: 0.1013 - val_mean_absolute_error: 0.1964
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 101ms/step - loss: 0.1530 - mean_absolute_error: 0.2753 - val_loss: 0.0604 - val_mean_absolute_error: 0.1472
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 0.2115 - mean_absolute_error: 0.3063 - val_loss: 0.0351 - val_mean_absolute_error: 0.0801
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 0.1336 - mean_absolute_error: 0.2735 - val_loss: 0.0297 - val_mean_absolute_error: 0.0603
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.1322 - mean_absolute_error: 0.2484 - val_loss: 0.0343 - val_mean_absolute_error: 0.0746
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 0.1166 - mean_absolute_error: 0.2311 - val_loss: 0.0297 - val_mean_absolute_error: 0.0586
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.8722 - mean_absolute_error: 0.4749 - val_loss: 0.0375 - val_mean_absolute_error: 0.1052
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.2622 - mean_absolute_error: 0.3180 - val_loss: 0.0477 - val_mean_absolute_error: 0.1367
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.2101 - mean_absolute_error: 0.2601 - val_loss: 0.0678 - val_mean_absolute_error: 0.1812
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.1606 - mean_absolute_error: 0.2244 - val_loss: 0.0664 - val_mean_absolute_error: 0.1915
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.5015 - mean_absolute_error: 0.3255 - val_loss: 0.0577 - val_mean_absolute_error: 0.1767
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.3673 - mean_absolute_error: 0.3163 - val_loss: 0.0451 - val_mean_absolute_error: 0.1286
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 52ms/step - loss: 0.2335 - mean_absolute_error: 0.2733 - val_loss: 0.0413 - val_mean_absolute_error: 0.1122
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.1183 - mean_absolute_error: 0.2112 - val_loss: 0.0552 - val_mean_absolute_error: 0.1667
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.1899 - mean_absolute_error: 0.2776 - val_loss: 0.0545 - val_mean_absolute_error: 0.1624
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.1429 - mean_absolute_error: 0.2244 - val_loss: 0.0474 - val_mean_absolute_error: 0.1470
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.1153 - mean_absolute_error: 0.2115 - val_loss: 0.0431 - val_mean_absolute_error: 0.1326
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.0842 - mean_absolute_error: 0.5394 - val_loss: 0.0878 - val_mean_absolute_error: 0.2259
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3829 - mean_absolute_error: 0.3062 - val_loss: 0.1391 - val_mean_absolute_error: 0.2846
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4272 - mean_absolute_error: 0.3978 - val_loss: 0.1724 - val_mean_absolute_error: 0.3109
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.8877 - mean_absolute_error: 0.4389 - val_loss: 0.1476 - val_mean_absolute_error: 0.2844
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.2114 - mean_absolute_error: 0.2845 - val_loss: 0.1247 - val_mean_absolute_error: 0.2547
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.7988 - mean_absolute_error: 0.4030 - val_loss: 0.0948 - val_mean_absolute_error: 0.2280
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1246 - mean_absolute_error: 0.2448 - val_loss: 0.0775 - val_mean_absolute_error: 0.2196
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.9114 - mean_absolute_error: 0.3794 - val_loss: 0.1016 - val_mean_absolute_error: 0.2683
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.1607 - mean_absolute_error: 0.2652 - val_loss: 0.1872 - val_mean_absolute_error: 0.3623
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.3749 - mean_absolute_error: 0.2952 - val_loss: 0.2894 - val_mean_absolute_error: 0.4342
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.5847 - mean_absolute_error: 0.4088 - val_loss: 0.3559 - val_mean_absolute_error: 0.4557
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3691 - mean_absolute_error: 0.3955 - val_loss: 0.3146 - val_mean_absolute_error: 0.4094
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.6982 - mean_absolute_error: 0.4442 - val_loss: 0.2169 - val_mean_absolute_error: 0.3230
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4120 - mean_absolute_error: 0.3480 - val_loss: 0.1106 - val_mean_absolute_error: 0.2076
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.3686 - mean_absolute_error: 0.4523 - val_loss: 0.0922 - val_mean_absolute_error: 0.1889
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.1182 - mean_absolute_error: 0.2360 - val_loss: 0.1058 - val_mean_absolute_error: 0.2138
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.2780 - mean_absolute_error: 0.3619 - val_loss: 0.1189 - val_mean_absolute_error: 0.2550
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.3561 - mean_absolute_error: 0.3165 - val_loss: 0.1733 - val_mean_absolute_error: 0.3183
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3810 - mean_absolute_error: 0.4168 - val_loss: 0.2007 - val_mean_absolute_error: 0.3464
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 51ms/step - loss: 0.1878 - mean_absolute_error: 0.3051 - val_loss: 0.1938 - val_mean_absolute_error: 0.3335
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.1747 - mean_absolute_error: 0.2235 - val_loss: 0.1797 - val_mean_absolute_error: 0.3151
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.0894 - mean_absolute_error: 0.1971 - val_loss: 0.1517 - val_mean_absolute_error: 0.2788
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.1679 - mean_absolute_error: 0.2593 - val_loss: 0.1421 - val_mean_absolute_error: 0.2509
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.0773 - mean_absolute_error: 0.1806 - val_loss: 0.1239 - val_mean_absolute_error: 0.2079
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3020 - mean_absolute_error: 0.3434 - val_loss: 0.1271 - val_mean_absolute_error: 0.2143
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2823 - mean_absolute_error: 0.3208 - val_loss: 0.1012 - val_mean_absolute_error: 0.1931
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6683 - mean_absolute_error: 0.4122 - val_loss: 0.0562 - val_mean_absolute_error: 0.1380
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.5378 - mean_absolute_error: 0.4349 - val_loss: 0.0737 - val_mean_absolute_error: 0.1627
Validation losses: [38.52220153808594, 0.9102684855461121, 0.2253435254096985, 0.18319940567016602, 0.0737283006310463]
HPS: {'player_emb_dim': 32, 'dense_units': 32, 'dense_units_2': 48, 'learning_rate': 0.01, 'dropout_rate': 0.1}. MSE during RandomSearch: 1.6449332237243652. Starting evaluation across all k folds...
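The per-fold validation losses printed above are dominated by the first fold's outlier (≈38.5 against sub-1.0 values elsewhere). As a minimal sketch in plain Python (values copied from the output above; not part of the training pipeline), comparing the mean to the median makes that skew explicit:

```python
from statistics import mean, median

# Final per-fold validation losses, copied from the log output above.
fold_val_losses = [
    38.52220153808594,
    0.9102684855461121,
    0.2253435254096985,
    0.18319940567016602,
    0.0737283006310463,
]

# The outlier first fold drags the mean up by an order of magnitude,
# while the median stays close to the typical fold.
print(f"mean:   {mean(fold_val_losses):.4f}")
print(f"median: {median(fold_val_losses):.4f}")
```

With this few folds, a single pathological split can dominate any mean-based summary, which is worth keeping in mind when reading the aggregated search score reported above.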

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 4.2410 - mean_absolute_error: 1.5649 - val_loss: 38.5559 - val_mean_absolute_error: 4.7837
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.7722 - mean_absolute_error: 1.2613 - val_loss: 38.2112 - val_mean_absolute_error: 4.7440
Epoch 3/50
2025-08-09 14:13:14.009025: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.1547 - mean_absolute_error: 1.1057 - val_loss: 34.9771 - val_mean_absolute_error: 4.4990
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.7224 - mean_absolute_error: 0.6726 - val_loss: 34.3773 - val_mean_absolute_error: 4.4107
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5175 - mean_absolute_error: 0.5344 - val_loss: 35.6804 - val_mean_absolute_error: 4.4573
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2471 - mean_absolute_error: 0.4370 - val_loss: 38.6569 - val_mean_absolute_error: 4.6235
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4430 - mean_absolute_error: 0.5041 - val_loss: 41.6035 - val_mean_absolute_error: 4.9130
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2970 - mean_absolute_error: 0.4297 - val_loss: 44.5630 - val_mean_absolute_error: 5.1088
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.8806 - mean_absolute_error: 1.0060 - val_loss: 42.9412 - val_mean_absolute_error: 5.0830
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.0419 - mean_absolute_error: 0.8108 - val_loss: 38.7987 - val_mean_absolute_error: 4.8624
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.3161 - mean_absolute_error: 0.8888 - val_loss: 34.6920 - val_mean_absolute_error: 4.4686
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6254 - mean_absolute_error: 0.6035 - val_loss: 33.8906 - val_mean_absolute_error: 4.3906
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5660 - mean_absolute_error: 0.6314 - val_loss: 35.2934 - val_mean_absolute_error: 4.5093
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3104 - mean_absolute_error: 0.4341 - val_loss: 37.0682 - val_mean_absolute_error: 4.6227
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6225 - mean_absolute_error: 0.6854 - val_loss: 38.1108 - val_mean_absolute_error: 4.6575
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3845 - mean_absolute_error: 0.5138 - val_loss: 38.8677 - val_mean_absolute_error: 4.6569
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.8398 - mean_absolute_error: 0.6703 - val_loss: 38.9909 - val_mean_absolute_error: 4.7405
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.0178 - mean_absolute_error: 0.7371 - val_loss: 38.7175 - val_mean_absolute_error: 4.7235
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.4832 - mean_absolute_error: 0.4695 - val_loss: 38.6305 - val_mean_absolute_error: 4.6786
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3471 - mean_absolute_error: 0.4365 - val_loss: 38.4924 - val_mean_absolute_error: 4.6668
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2931 - mean_absolute_error: 0.4110 - val_loss: 38.6567 - val_mean_absolute_error: 4.6820
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.2161 - mean_absolute_error: 0.3777 - val_loss: 38.9795 - val_mean_absolute_error: 4.7030
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2291 - mean_absolute_error: 0.3217 - val_loss: 39.0718 - val_mean_absolute_error: 4.6993
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.2002 - mean_absolute_error: 0.3103 - val_loss: 38.9786 - val_mean_absolute_error: 4.6847
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 51ms/step - loss: 0.1397 - mean_absolute_error: 0.2171 - val_loss: 39.1637 - val_mean_absolute_error: 4.6949
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.2455 - mean_absolute_error: 0.3886 - val_loss: 39.2347 - val_mean_absolute_error: 4.7203
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1874 - mean_absolute_error: 0.2941 - val_loss: 39.5539 - val_mean_absolute_error: 4.7506
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2435 - mean_absolute_error: 0.3443 - val_loss: 39.4509 - val_mean_absolute_error: 4.7317
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1887 - mean_absolute_error: 0.3274 - val_loss: 39.1824 - val_mean_absolute_error: 4.7215
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1365 - mean_absolute_error: 0.2946 - val_loss: 39.1343 - val_mean_absolute_error: 4.7189
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1898 - mean_absolute_error: 0.3129 - val_loss: 39.1320 - val_mean_absolute_error: 4.7173
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1169 - mean_absolute_error: 0.2589 - val_loss: 39.1572 - val_mean_absolute_error: 4.7202
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.0949 - mean_absolute_error: 0.2097 - val_loss: 39.0201 - val_mean_absolute_error: 4.7159
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4288 - mean_absolute_error: 0.3656 - val_loss: 39.0414 - val_mean_absolute_error: 4.7282
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3025 - mean_absolute_error: 0.3426 - val_loss: 39.1977 - val_mean_absolute_error: 4.7494
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1798 - mean_absolute_error: 0.3033 - val_loss: 39.3614 - val_mean_absolute_error: 4.7838
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.5295 - mean_absolute_error: 0.3725 - val_loss: 39.7386 - val_mean_absolute_error: 4.8344
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2167 - mean_absolute_error: 0.3157 - val_loss: 39.7085 - val_mean_absolute_error: 4.8427
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3643 - mean_absolute_error: 0.4030 - val_loss: 39.3806 - val_mean_absolute_error: 4.8279
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1799 - mean_absolute_error: 0.3047 - val_loss: 39.3357 - val_mean_absolute_error: 4.8164
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4528 - mean_absolute_error: 0.3914 - val_loss: 39.3733 - val_mean_absolute_error: 4.7906
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2898 - mean_absolute_error: 0.3351 - val_loss: 39.2810 - val_mean_absolute_error: 4.7737
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3420 - mean_absolute_error: 0.3828 - val_loss: 39.0428 - val_mean_absolute_error: 4.7319
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1800 - mean_absolute_error: 0.3078 - val_loss: 38.9219 - val_mean_absolute_error: 4.6988
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3562 - mean_absolute_error: 0.4502 - val_loss: 38.9632 - val_mean_absolute_error: 4.6991
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1143 - mean_absolute_error: 0.2470 - val_loss: 38.9022 - val_mean_absolute_error: 4.7023
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.0715 - mean_absolute_error: 0.1691 - val_loss: 38.4182 - val_mean_absolute_error: 4.6927
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1807 - mean_absolute_error: 0.2957 - val_loss: 38.0923 - val_mean_absolute_error: 4.6766
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2234 - mean_absolute_error: 0.3309 - val_loss: 37.9899 - val_mean_absolute_error: 4.6620
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1570 - mean_absolute_error: 0.2657 - val_loss: 38.2201 - val_mean_absolute_error: 4.6737

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 62ms/step - loss: 10.3653 - mean_absolute_error: 1.4906 - val_loss: 0.0289 - val_mean_absolute_error: 0.1113
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 8.9842 - mean_absolute_error: 1.3962 - val_loss: 0.1446 - val_mean_absolute_error: 0.2571
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 7.6098 - mean_absolute_error: 1.5583 - val_loss: 0.4886 - val_mean_absolute_error: 0.5268
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 8.6531 - mean_absolute_error: 1.8681 - val_loss: 0.3061 - val_mean_absolute_error: 0.4361
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 7.3008 - mean_absolute_error: 1.7561 - val_loss: 0.1368 - val_mean_absolute_error: 0.3142
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 6.0361 - mean_absolute_error: 1.4061 - val_loss: 0.2753 - val_mean_absolute_error: 0.4113
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 7.1573 - mean_absolute_error: 1.5191 - val_loss: 0.7590 - val_mean_absolute_error: 0.6460
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 4.4190 - mean_absolute_error: 1.4348 - val_loss: 2.4314 - val_mean_absolute_error: 1.0050
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 3.5669 - mean_absolute_error: 1.4537 - val_loss: 6.8353 - val_mean_absolute_error: 1.5378
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.7042 - mean_absolute_error: 0.9569 - val_loss: 15.1036 - val_mean_absolute_error: 2.1683
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.0543 - mean_absolute_error: 0.9818 - val_loss: 20.6972 - val_mean_absolute_error: 2.5264
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.8503 - mean_absolute_error: 1.1458 - val_loss: 18.9349 - val_mean_absolute_error: 2.4688
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.4347 - mean_absolute_error: 1.1465 - val_loss: 13.6970 - val_mean_absolute_error: 2.1558
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.6915 - mean_absolute_error: 0.9964 - val_loss: 8.6545 - val_mean_absolute_error: 1.7526
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.8510 - mean_absolute_error: 1.1125 - val_loss: 6.5046 - val_mean_absolute_error: 1.5588
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 7.1115 - mean_absolute_error: 1.6371 - val_loss: 6.8613 - val_mean_absolute_error: 1.5718
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.8946 - mean_absolute_error: 1.0148 - val_loss: 7.7286 - val_mean_absolute_error: 1.6648
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.6436 - mean_absolute_error: 1.1159 - val_loss: 11.0446 - val_mean_absolute_error: 1.9634
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.3671 - mean_absolute_error: 0.7178 - val_loss: 14.9463 - val_mean_absolute_error: 2.2850
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9255 - mean_absolute_error: 0.7271 - val_loss: 21.7741 - val_mean_absolute_error: 2.7337
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.1025 - mean_absolute_error: 0.8011 - val_loss: 26.4685 - val_mean_absolute_error: 2.9657
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.1426 - mean_absolute_error: 0.8996 - val_loss: 20.3512 - val_mean_absolute_error: 2.6566
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.2549 - mean_absolute_error: 0.8620 - val_loss: 11.4856 - val_mean_absolute_error: 2.0381
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.9320 - mean_absolute_error: 0.8185 - val_loss: 7.2699 - val_mean_absolute_error: 1.9581
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.0977 - mean_absolute_error: 0.7453 - val_loss: 5.7235 - val_mean_absolute_error: 1.9229
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 91ms/step - loss: 1.6766 - mean_absolute_error: 0.9036 - val_loss: 6.3951 - val_mean_absolute_error: 2.0610
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.3324 - mean_absolute_error: 0.8385 - val_loss: 9.0141 - val_mean_absolute_error: 2.3551
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.9065 - mean_absolute_error: 0.7251 - val_loss: 13.5809 - val_mean_absolute_error: 2.6387
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.7970 - mean_absolute_error: 0.7384 - val_loss: 20.9470 - val_mean_absolute_error: 2.9668
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.8911 - mean_absolute_error: 0.6870 - val_loss: 22.8040 - val_mean_absolute_error: 2.9747
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.4164 - mean_absolute_error: 0.8743 - val_loss: 18.3199 - val_mean_absolute_error: 2.7162
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.6974 - mean_absolute_error: 0.6893 - val_loss: 15.4893 - val_mean_absolute_error: 2.5938
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.1926 - mean_absolute_error: 0.7053 - val_loss: 9.9843 - val_mean_absolute_error: 2.2406
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.1507 - mean_absolute_error: 0.8168 - val_loss: 7.5345 - val_mean_absolute_error: 2.0338
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.3327 - mean_absolute_error: 0.8543 - val_loss: 7.0752 - val_mean_absolute_error: 2.0000
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4200 - mean_absolute_error: 0.5375 - val_loss: 6.5756 - val_mean_absolute_error: 1.9558
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5218 - mean_absolute_error: 0.4836 - val_loss: 6.4985 - val_mean_absolute_error: 1.9445
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3433 - mean_absolute_error: 0.4910 - val_loss: 5.7776 - val_mean_absolute_error: 1.8211
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.2612 - mean_absolute_error: 0.7050 - val_loss: 6.5409 - val_mean_absolute_error: 1.9144
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5094 - mean_absolute_error: 0.5181 - val_loss: 6.9511 - val_mean_absolute_error: 1.9562
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.6341 - mean_absolute_error: 0.7645 - val_loss: 8.3734 - val_mean_absolute_error: 2.0671
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.5775 - mean_absolute_error: 0.5437 - val_loss: 10.4133 - val_mean_absolute_error: 2.1837
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.0716 - mean_absolute_error: 0.7163 - val_loss: 9.8523 - val_mean_absolute_error: 2.1303
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.1074 - mean_absolute_error: 0.7531 - val_loss: 7.6528 - val_mean_absolute_error: 1.9203
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4081 - mean_absolute_error: 0.4972 - val_loss: 6.1155 - val_mean_absolute_error: 1.7598
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2306 - mean_absolute_error: 0.3698 - val_loss: 4.8692 - val_mean_absolute_error: 1.6124
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4238 - mean_absolute_error: 0.5125 - val_loss: 4.4159 - val_mean_absolute_error: 1.5640
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.8099 - mean_absolute_error: 0.5872 - val_loss: 4.8705 - val_mean_absolute_error: 1.6262
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2296 - mean_absolute_error: 0.3272 - val_loss: 5.6594 - val_mean_absolute_error: 1.7162
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.3661 - mean_absolute_error: 0.7017 - val_loss: 7.8227 - val_mean_absolute_error: 1.9353

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 65ms/step - loss: 3.1675 - mean_absolute_error: 0.9916 - val_loss: 0.1016 - val_mean_absolute_error: 0.2557
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 103ms/step - loss: 1.7183 - mean_absolute_error: 0.8198 - val_loss: 0.1669 - val_mean_absolute_error: 0.2706
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 79ms/step - loss: 2.5944 - mean_absolute_error: 0.8736 - val_loss: 0.3333 - val_mean_absolute_error: 0.3627
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.6638 - mean_absolute_error: 0.5766 - val_loss: 0.4651 - val_mean_absolute_error: 0.4499
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.6221 - mean_absolute_error: 0.6458 - val_loss: 0.5066 - val_mean_absolute_error: 0.4774
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.3532 - mean_absolute_error: 0.8554 - val_loss: 0.4728 - val_mean_absolute_error: 0.4494
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9813 - mean_absolute_error: 0.7078 - val_loss: 0.4220 - val_mean_absolute_error: 0.4858
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.9788 - mean_absolute_error: 0.7765 - val_loss: 0.4052 - val_mean_absolute_error: 0.5314
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.2585 - mean_absolute_error: 0.8519 - val_loss: 0.4976 - val_mean_absolute_error: 0.6567
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.9620 - mean_absolute_error: 0.7783 - val_loss: 0.5926 - val_mean_absolute_error: 0.6949
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.3403 - mean_absolute_error: 0.9598 - val_loss: 0.5399 - val_mean_absolute_error: 0.6609
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.9724 - mean_absolute_error: 0.8007 - val_loss: 0.4161 - val_mean_absolute_error: 0.5886
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5144 - mean_absolute_error: 0.6020 - val_loss: 0.3420 - val_mean_absolute_error: 0.4830
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.0846 - mean_absolute_error: 0.6561 - val_loss: 0.3360 - val_mean_absolute_error: 0.3973
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.5795 - mean_absolute_error: 0.6377 - val_loss: 0.3806 - val_mean_absolute_error: 0.4204
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.5130 - mean_absolute_error: 0.4795 - val_loss: 0.4501 - val_mean_absolute_error: 0.4909
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2219 - mean_absolute_error: 0.6655 - val_loss: 0.5203 - val_mean_absolute_error: 0.5641
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5019 - mean_absolute_error: 0.4881 - val_loss: 0.6329 - val_mean_absolute_error: 0.6647
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.0916 - mean_absolute_error: 0.5773 - val_loss: 0.7911 - val_mean_absolute_error: 0.7753
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2989 - mean_absolute_error: 0.3799 - val_loss: 0.9876 - val_mean_absolute_error: 0.8604
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5685 - mean_absolute_error: 0.4638 - val_loss: 1.2358 - val_mean_absolute_error: 0.9292
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.6464 - mean_absolute_error: 0.5400 - val_loss: 1.5654 - val_mean_absolute_error: 0.9514
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3606 - mean_absolute_error: 0.4467 - val_loss: 2.1078 - val_mean_absolute_error: 0.9672
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.9024 - mean_absolute_error: 0.8907 - val_loss: 2.2640 - val_mean_absolute_error: 0.9251
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3615 - mean_absolute_error: 0.4958 - val_loss: 2.1718 - val_mean_absolute_error: 0.9349
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5774 - mean_absolute_error: 0.6044 - val_loss: 2.0340 - val_mean_absolute_error: 0.9612
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.9634 - mean_absolute_error: 0.6388 - val_loss: 1.8603 - val_mean_absolute_error: 0.9700
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1678 - mean_absolute_error: 0.3062 - val_loss: 1.7604 - val_mean_absolute_error: 0.9618
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3529 - mean_absolute_error: 0.4227 - val_loss: 1.6331 - val_mean_absolute_error: 0.9246
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.2443 - mean_absolute_error: 0.4722 - val_loss: 1.5466 - val_mean_absolute_error: 0.8951
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.0990 - mean_absolute_error: 0.2417 - val_loss: 1.4856 - val_mean_absolute_error: 0.8596
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 82ms/step - loss: 0.1973 - mean_absolute_error: 0.3039 - val_loss: 1.5108 - val_mean_absolute_error: 0.8064
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 0.2216 - mean_absolute_error: 0.3369 - val_loss: 1.5653 - val_mean_absolute_error: 0.8260
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1932 - mean_absolute_error: 0.3064 - val_loss: 1.5317 - val_mean_absolute_error: 0.8362
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.1659 - mean_absolute_error: 0.2848 - val_loss: 1.5722 - val_mean_absolute_error: 0.8888
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3469 - mean_absolute_error: 0.3910 - val_loss: 1.5211 - val_mean_absolute_error: 0.8950
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1560 - mean_absolute_error: 0.2637 - val_loss: 1.4169 - val_mean_absolute_error: 0.8427
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4881 - mean_absolute_error: 0.5459 - val_loss: 1.3620 - val_mean_absolute_error: 0.8768
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.5326 - mean_absolute_error: 0.4459 - val_loss: 1.3728 - val_mean_absolute_error: 0.9347
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.4062 - mean_absolute_error: 0.4872 - val_loss: 1.4034 - val_mean_absolute_error: 0.9412
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.9153 - mean_absolute_error: 0.5335 - val_loss: 1.4393 - val_mean_absolute_error: 0.9054
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.6001 - mean_absolute_error: 0.4492 - val_loss: 1.4941 - val_mean_absolute_error: 0.8339
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1641 - mean_absolute_error: 0.2473 - val_loss: 1.5771 - val_mean_absolute_error: 0.7728
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.8309 - mean_absolute_error: 0.4909 - val_loss: 1.6717 - val_mean_absolute_error: 0.7636
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.2701 - mean_absolute_error: 0.5803 - val_loss: 1.7642 - val_mean_absolute_error: 0.7631
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2689 - mean_absolute_error: 0.3626 - val_loss: 1.7959 - val_mean_absolute_error: 0.7368
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2423 - mean_absolute_error: 0.3778 - val_loss: 1.7606 - val_mean_absolute_error: 0.7131
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4738 - mean_absolute_error: 0.3946 - val_loss: 1.6864 - val_mean_absolute_error: 0.6824
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1558 - mean_absolute_error: 0.2916 - val_loss: 1.5307 - val_mean_absolute_error: 0.7210
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.6560 - mean_absolute_error: 0.5004 - val_loss: 1.3784 - val_mean_absolute_error: 0.7648

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 66ms/step - loss: 1.8805 - mean_absolute_error: 0.6864 - val_loss: 0.0864 - val_mean_absolute_error: 0.2636
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.7325 - mean_absolute_error: 0.5707 - val_loss: 0.1996 - val_mean_absolute_error: 0.4340
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.3858 - mean_absolute_error: 0.9787 - val_loss: 0.2040 - val_mean_absolute_error: 0.4359
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.1543 - mean_absolute_error: 1.0886 - val_loss: 0.1312 - val_mean_absolute_error: 0.3395
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.7587 - mean_absolute_error: 0.6747 - val_loss: 0.0765 - val_mean_absolute_error: 0.2373
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5802 - mean_absolute_error: 0.5345 - val_loss: 0.0462 - val_mean_absolute_error: 0.1474
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2745 - mean_absolute_error: 0.4175 - val_loss: 0.0409 - val_mean_absolute_error: 0.1274
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2762 - mean_absolute_error: 0.3592 - val_loss: 0.0410 - val_mean_absolute_error: 0.1192
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.1858 - mean_absolute_error: 0.6576 - val_loss: 0.0507 - val_mean_absolute_error: 0.1431
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.3879 - mean_absolute_error: 0.6730 - val_loss: 0.0805 - val_mean_absolute_error: 0.2177
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5870 - mean_absolute_error: 0.4244 - val_loss: 0.1380 - val_mean_absolute_error: 0.3204
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.2667 - mean_absolute_error: 0.6796 - val_loss: 0.2376 - val_mean_absolute_error: 0.4335
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4377 - mean_absolute_error: 0.4852 - val_loss: 0.3190 - val_mean_absolute_error: 0.4892
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5555 - mean_absolute_error: 0.5602 - val_loss: 0.4061 - val_mean_absolute_error: 0.4977
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 85ms/step - loss: 1.1902 - mean_absolute_error: 0.7214 - val_loss: 0.3742 - val_mean_absolute_error: 0.4803
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 89ms/step - loss: 0.7759 - mean_absolute_error: 0.5139 - val_loss: 0.3483 - val_mean_absolute_error: 0.4450
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 74ms/step - loss: 1.7757 - mean_absolute_error: 0.6924 - val_loss: 0.1997 - val_mean_absolute_error: 0.3178
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.4548 - mean_absolute_error: 0.5305 - val_loss: 0.1198 - val_mean_absolute_error: 0.2408
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 60ms/step - loss: 1.3904 - mean_absolute_error: 0.5580 - val_loss: 0.0487 - val_mean_absolute_error: 0.1551
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 66ms/step - loss: 0.3874 - mean_absolute_error: 0.5213 - val_loss: 0.0283 - val_mean_absolute_error: 0.1084
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 49ms/step - loss: 0.4465 - mean_absolute_error: 0.3997 - val_loss: 0.0483 - val_mean_absolute_error: 0.1232
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 51ms/step - loss: 0.3170 - mean_absolute_error: 0.4288 - val_loss: 0.0958 - val_mean_absolute_error: 0.2177
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4735 - mean_absolute_error: 0.4231 - val_loss: 0.1555 - val_mean_absolute_error: 0.2930
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4119 - mean_absolute_error: 0.4498 - val_loss: 0.2278 - val_mean_absolute_error: 0.3767
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 0.3082 - mean_absolute_error: 0.4425 - val_loss: 0.3126 - val_mean_absolute_error: 0.4794
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 60ms/step - loss: 0.5511 - mean_absolute_error: 0.5304 - val_loss: 0.3353 - val_mean_absolute_error: 0.5117
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 0.4086 - mean_absolute_error: 0.5255 - val_loss: 0.2875 - val_mean_absolute_error: 0.4745
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.4895 - mean_absolute_error: 0.7634 - val_loss: 0.2420 - val_mean_absolute_error: 0.4445
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 51ms/step - loss: 0.3432 - mean_absolute_error: 0.4400 - val_loss: 0.1960 - val_mean_absolute_error: 0.4068
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2416 - mean_absolute_error: 0.4119 - val_loss: 0.1711 - val_mean_absolute_error: 0.3843
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.1651 - mean_absolute_error: 0.3022 - val_loss: 0.1546 - val_mean_absolute_error: 0.3624
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 77ms/step - loss: 0.1466 - mean_absolute_error: 0.3081 - val_loss: 0.1521 - val_mean_absolute_error: 0.3531
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 54ms/step - loss: 0.2342 - mean_absolute_error: 0.3902 - val_loss: 0.1735 - val_mean_absolute_error: 0.3672
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3401 - mean_absolute_error: 0.3703 - val_loss: 0.1964 - val_mean_absolute_error: 0.3815
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.8507 - mean_absolute_error: 0.4491 - val_loss: 0.1716 - val_mean_absolute_error: 0.3546
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.2191 - mean_absolute_error: 0.2972 - val_loss: 0.1500 - val_mean_absolute_error: 0.3323
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.0239 - mean_absolute_error: 0.5211 - val_loss: 0.1553 - val_mean_absolute_error: 0.3403
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2164 - mean_absolute_error: 0.3007 - val_loss: 0.1522 - val_mean_absolute_error: 0.3378
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1641 - mean_absolute_error: 0.3185 - val_loss: 0.1692 - val_mean_absolute_error: 0.3611
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 0.3587 - mean_absolute_error: 0.4129 - val_loss: 0.1884 - val_mean_absolute_error: 0.3827
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 160ms/step - loss: 0.1815 - mean_absolute_error: 0.2385 - val_loss: 0.2046 - val_mean_absolute_error: 0.4006
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 178ms/step - loss: 0.2466 - mean_absolute_error: 0.3447 - val_loss: 0.2324 - val_mean_absolute_error: 0.4308
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.1502 - mean_absolute_error: 0.2497 - val_loss: 0.2915 - val_mean_absolute_error: 0.4881
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1422 - mean_absolute_error: 0.2492 - val_loss: 0.3011 - val_mean_absolute_error: 0.4987
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.4032 - mean_absolute_error: 0.3993 - val_loss: 0.2819 - val_mean_absolute_error: 0.4841
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 54ms/step - loss: 0.3358 - mean_absolute_error: 0.4235 - val_loss: 0.2805 - val_mean_absolute_error: 0.4839
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 55ms/step - loss: 0.2769 - mean_absolute_error: 0.3295 - val_loss: 0.2631 - val_mean_absolute_error: 0.4659
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 73ms/step - loss: 0.6216 - mean_absolute_error: 0.4263 - val_loss: 0.2713 - val_mean_absolute_error: 0.4718
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 69ms/step - loss: 0.2971 - mean_absolute_error: 0.4229 - val_loss: 0.3183 - val_mean_absolute_error: 0.5115
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 65ms/step - loss: 0.6434 - mean_absolute_error: 0.3069 - val_loss: 0.3156 - val_mean_absolute_error: 0.5055

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 80ms/step - loss: 0.8304 - mean_absolute_error: 0.6039 - val_loss: 0.1049 - val_mean_absolute_error: 0.2701
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1240 - mean_absolute_error: 0.2605 - val_loss: 0.1424 - val_mean_absolute_error: 0.3323
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.0896 - mean_absolute_error: 0.1887 - val_loss: 0.1747 - val_mean_absolute_error: 0.3853
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2127 - mean_absolute_error: 0.3092 - val_loss: 0.1943 - val_mean_absolute_error: 0.4154
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.9240 - mean_absolute_error: 0.5231 - val_loss: 0.2131 - val_mean_absolute_error: 0.4327
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4746 - mean_absolute_error: 0.4064 - val_loss: 0.2095 - val_mean_absolute_error: 0.4258
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.0862 - mean_absolute_error: 0.4489 - val_loss: 0.1705 - val_mean_absolute_error: 0.3830
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3399 - mean_absolute_error: 0.3388 - val_loss: 0.1324 - val_mean_absolute_error: 0.3299
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2647 - mean_absolute_error: 0.3547 - val_loss: 0.1237 - val_mean_absolute_error: 0.3087
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3992 - mean_absolute_error: 0.3335 - val_loss: 0.1292 - val_mean_absolute_error: 0.3154
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1649 - mean_absolute_error: 0.2794 - val_loss: 0.1267 - val_mean_absolute_error: 0.3252
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2805 - mean_absolute_error: 0.3231 - val_loss: 0.1465 - val_mean_absolute_error: 0.3499
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.0735 - mean_absolute_error: 0.2045 - val_loss: 0.1569 - val_mean_absolute_error: 0.3657
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.1625 - mean_absolute_error: 0.4194 - val_loss: 0.2040 - val_mean_absolute_error: 0.4229
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1640 - mean_absolute_error: 0.2902 - val_loss: 0.2748 - val_mean_absolute_error: 0.4858
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.6082 - mean_absolute_error: 0.4199 - val_loss: 0.3634 - val_mean_absolute_error: 0.5486
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1904 - mean_absolute_error: 0.3010 - val_loss: 0.3838 - val_mean_absolute_error: 0.5628
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.8117 - mean_absolute_error: 0.3996 - val_loss: 0.3605 - val_mean_absolute_error: 0.5444
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.3318 - mean_absolute_error: 0.3584 - val_loss: 0.2798 - val_mean_absolute_error: 0.4827
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.8824 - mean_absolute_error: 0.4682 - val_loss: 0.1657 - val_mean_absolute_error: 0.3791
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2909 - mean_absolute_error: 0.2802 - val_loss: 0.1038 - val_mean_absolute_error: 0.2939
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.8613 - mean_absolute_error: 0.4054 - val_loss: 0.0628 - val_mean_absolute_error: 0.2090
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1474 - mean_absolute_error: 0.2734 - val_loss: 0.0590 - val_mean_absolute_error: 0.1918
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 87ms/step - loss: 0.2841 - mean_absolute_error: 0.3770 - val_loss: 0.1074 - val_mean_absolute_error: 0.2500
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.9177 - mean_absolute_error: 0.4009 - val_loss: 0.2022 - val_mean_absolute_error: 0.3251
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.6480 - mean_absolute_error: 0.4224 - val_loss: 0.3072 - val_mean_absolute_error: 0.3982
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.6041 - mean_absolute_error: 0.5189 - val_loss: 0.4117 - val_mean_absolute_error: 0.4962
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.1212 - mean_absolute_error: 0.5091 - val_loss: 0.4543 - val_mean_absolute_error: 0.5782
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2684 - mean_absolute_error: 0.2654 - val_loss: 0.4890 - val_mean_absolute_error: 0.6464
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.3484 - mean_absolute_error: 0.2862 - val_loss: 0.4998 - val_mean_absolute_error: 0.6818
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2627 - mean_absolute_error: 0.3327 - val_loss: 0.4790 - val_mean_absolute_error: 0.6797
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1594 - mean_absolute_error: 0.3032 - val_loss: 0.4783 - val_mean_absolute_error: 0.6788
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.5938 - mean_absolute_error: 0.5415 - val_loss: 0.4828 - val_mean_absolute_error: 0.6720
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.3262 - mean_absolute_error: 0.5448 - val_loss: 0.4227 - val_mean_absolute_error: 0.6167
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3862 - mean_absolute_error: 0.3968 - val_loss: 0.3387 - val_mean_absolute_error: 0.5295
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2321 - mean_absolute_error: 0.2555 - val_loss: 0.2637 - val_mean_absolute_error: 0.4329
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3910 - mean_absolute_error: 0.3363 - val_loss: 0.1718 - val_mean_absolute_error: 0.3204
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1141 - mean_absolute_error: 0.2504 - val_loss: 0.0991 - val_mean_absolute_error: 0.2553
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2399 - mean_absolute_error: 0.3697 - val_loss: 0.0620 - val_mean_absolute_error: 0.1861
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2102 - mean_absolute_error: 0.3465 - val_loss: 0.0829 - val_mean_absolute_error: 0.2388
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2502 - mean_absolute_error: 0.2911 - val_loss: 0.1378 - val_mean_absolute_error: 0.3235
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.4047 - mean_absolute_error: 0.3453 - val_loss: 0.2216 - val_mean_absolute_error: 0.4022
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.4552 - mean_absolute_error: 0.4045 - val_loss: 0.2791 - val_mean_absolute_error: 0.4532
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.6590 - mean_absolute_error: 0.4022 - val_loss: 0.2963 - val_mean_absolute_error: 0.4703
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.1123 - mean_absolute_error: 0.2097 - val_loss: 0.3440 - val_mean_absolute_error: 0.4968
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3592 - mean_absolute_error: 0.3844 - val_loss: 0.3474 - val_mean_absolute_error: 0.4939
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1520 - mean_absolute_error: 0.2946 - val_loss: 0.3221 - val_mean_absolute_error: 0.4672
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1418 - mean_absolute_error: 0.2357 - val_loss: 0.2568 - val_mean_absolute_error: 0.4147
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1257 - mean_absolute_error: 0.2080 - val_loss: 0.1905 - val_mean_absolute_error: 0.3552
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1542 - mean_absolute_error: 0.2706 - val_loss: 0.1630 - val_mean_absolute_error: 0.3229
Validation losses: [38.22005844116211, 7.822729587554932, 1.378354549407959, 0.3155701160430908, 0.16300755739212036]
HPS: {'player_emb_dim': 32, 'dense_units': 32, 'dense_units_2': 80, 'learning_rate': 0.0001, 'dropout_rate': 0.1}. MSE during RandomSearch: 3.2651822566986084. Starting evaluation across all k folds...
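The log above reports the best hyperparameters found by the random search and then re-evaluates them across all k folds, which produces the per-fold epoch logs below. A minimal sketch of that fold loop, assuming a small numeric target array; `evaluate_fold` is a placeholder stand-in here (a mean predictor), whereas the notebook trains the tuned Keras model inside each fold:

```python
import numpy as np

def kfold_indices(n_samples, k=5, seed=0):
    """Yield (train_idx, val_idx) index pairs for k folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        val_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, val_idx

def evaluate_fold(y_train, y_val):
    """Placeholder: validation MSE of a mean predictor.
    In the notebook this step is model.fit(...) with the tuned
    hyperparameters, returning the fold's final val_loss."""
    pred = y_train.mean()
    return float(np.mean((y_val - pred) ** 2))

y = np.arange(20, dtype=float)  # stand-in for the players' target values
losses = [evaluate_fold(y[tr], y[va]) for tr, va in kfold_indices(len(y), k=5)]
print(f"Validation losses: {losses}")
print(f"Mean: {np.mean(losses):.4f}")
```

Averaging the per-fold validation losses, rather than trusting a single split, matters here because with only ~20 recorded games any one validation fold can be wildly unrepresentative (compare the fold-level val_loss values in the log above).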

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 4.3657 - mean_absolute_error: 1.5628 - val_loss: 33.0891 - val_mean_absolute_error: 4.4770
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 4.4150 - mean_absolute_error: 1.5916 - val_loss: 33.1115 - val_mean_absolute_error: 4.4812
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 4.5377 - mean_absolute_error: 1.5992 - val_loss: 33.1323 - val_mean_absolute_error: 4.4857
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.4617 - mean_absolute_error: 1.5714 - val_loss: 33.1528 - val_mean_absolute_error: 4.4900
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 4.0242 - mean_absolute_error: 1.4605 - val_loss: 33.1720 - val_mean_absolute_error: 4.4940
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 34ms/step - loss: 3.9238 - mean_absolute_error: 1.4708 - val_loss: 33.1835 - val_mean_absolute_error: 4.4973
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 3.9803 - mean_absolute_error: 1.5169 - val_loss: 33.1897 - val_mean_absolute_error: 4.5002
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 4.0945 - mean_absolute_error: 1.5260 - val_loss: 33.2049 - val_mean_absolute_error: 4.5037
...
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.5771 - mean_absolute_error: 1.1756 - val_loss: 33.0039 - val_mean_absolute_error: 4.5304

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 60ms/step - loss: 10.9026 - mean_absolute_error: 2.1145 - val_loss: 2.4090 - val_mean_absolute_error: 1.2465
...
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 56ms/step - loss: 6.7784 - mean_absolute_error: 1.6957 - val_loss: 1.6477 - val_mean_absolute_error: 1.1262

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 79ms/step - loss: 7.4258 - mean_absolute_error: 1.8257 - val_loss: 2.7607 - val_mean_absolute_error: 1.1554
...
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 70ms/step - loss: 3.2963 - mean_absolute_error: 1.2865 - val_loss: 5.2659 - val_mean_absolute_error: 1.7760

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 69ms/step - loss: 4.4347 - mean_absolute_error: 1.3723 - val_loss: 1.3006 - val_mean_absolute_error: 0.9942
...
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 70ms/step - loss: 1.6134 - mean_absolute_error: 0.9785 - val_loss: 2.4301 - val_mean_absolute_error: 1.4227

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 66ms/step - loss: 2.0899 - mean_absolute_error: 1.1882 - val_loss: 0.2586 - val_mean_absolute_error: 0.4774
...
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.7841 - mean_absolute_error: 0.6951 - val_loss: 0.5666 - val_mean_absolute_error: 0.7455
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 3.1367 - mean_absolute_error: 1.1339 - val_loss: 0.5733 - val_mean_absolute_error: 0.7500
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.5950 - mean_absolute_error: 0.9307 - val_loss: 0.5723 - val_mean_absolute_error: 0.7492
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9617 - mean_absolute_error: 0.6974 - val_loss: 0.5724 - val_mean_absolute_error: 0.7490
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.2620 - mean_absolute_error: 1.0821 - val_loss: 0.5636 - val_mean_absolute_error: 0.7427
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.6445 - mean_absolute_error: 0.9018 - val_loss: 0.5538 - val_mean_absolute_error: 0.7355
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.7788 - mean_absolute_error: 0.6699 - val_loss: 0.5485 - val_mean_absolute_error: 0.7314
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.8820 - mean_absolute_error: 0.6803 - val_loss: 0.5457 - val_mean_absolute_error: 0.7291
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9708 - mean_absolute_error: 0.7844 - val_loss: 0.5420 - val_mean_absolute_error: 0.7263
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.0144 - mean_absolute_error: 0.7880 - val_loss: 0.5447 - val_mean_absolute_error: 0.7281
Validation losses: [33.00392532348633, 1.6476508378982544, 5.265861988067627, 2.4301352500915527, 0.5446701645851135]
HPS: {'player_emb_dim': 8, 'dense_units': 96, 'dense_units_2': 80, 'learning_rate': 0.001, 'dropout_rate': 0.1}. MSE during RandomSearch: 1.556578278541565. Starting evaluation across all k folds...
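The summary line above reports the best hyperparameters found by the random search and then retrains a fresh model on every fold. A minimal sketch of that per-fold retraining loop follows; `build_model`, `X`, and `y` here are hypothetical stand-ins for the chapter's actual `competition_manager` helpers and data, not the real implementation:

```python
import numpy as np

def kfold_indices(n, n_splits):
    # Split [0, n) into n_splits contiguous folds; yield (train, val) index pairs.
    folds = np.array_split(np.arange(n), n_splits)
    for i, val_idx in enumerate(folds):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train_idx, val_idx

def evaluate_hps(build_model, X, y, hps, n_splits=5, epochs=50):
    # Retrain a fresh model on each fold and collect its final validation loss.
    fold_losses = []
    for train_idx, val_idx in kfold_indices(len(X), n_splits):
        model = build_model(**hps)              # fresh weights every fold
        model.fit(X[train_idx], y[train_idx],
                  validation_data=(X[val_idx], y[val_idx]),
                  epochs=epochs, verbose=0)
        fold_losses.append(model.evaluate(X[val_idx], y[val_idx], verbose=0)[0])
    return float(np.mean(fold_losses)), fold_losses
```

Averaging the per-fold losses, rather than trusting the single tuner split, is what exposes the large fold-to-fold variance visible in the logs below.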

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 3.6867 - mean_absolute_error: 1.4570 - val_loss: 36.6881 - val_mean_absolute_error: 4.7232
[... epochs 2-49 elided; training loss falls steadily while val_loss stays above 36 throughout ...]
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2121 - mean_absolute_error: 0.2746 - val_loss: 43.1422 - val_mean_absolute_error: 4.9649

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 60ms/step - loss: 11.8428 - mean_absolute_error: 1.6369 - val_loss: 0.0220 - val_mean_absolute_error: 0.0863
[... epochs 2-49 elided; val_loss climbs from 0.02 to above 1.5 around epoch 43 before easing back ...]
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.4418 - mean_absolute_error: 0.5445 - val_loss: 1.1464 - val_mean_absolute_error: 0.9152

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 57ms/step - loss: 0.9666 - mean_absolute_error: 0.8108 - val_loss: 0.4887 - val_mean_absolute_error: 0.5912
[... epochs 2-49 elided; val_loss reaches a minimum near 0.056 around epoch 10, then oscillates between 0.06 and 0.26 ...]
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.1866 - mean_absolute_error: 0.3364 - val_loss: 0.1314 - val_mean_absolute_error: 0.2847

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 57ms/step - loss: 0.1366 - mean_absolute_error: 0.2917 - val_loss: 0.0629 - val_mean_absolute_error: 0.1746
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.3104 - mean_absolute_error: 0.3535 - val_loss: 0.0563 - val_mean_absolute_error: 0.1656
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4068 - mean_absolute_error: 0.4788 - val_loss: 0.0439 - val_mean_absolute_error: 0.1485
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2694 - mean_absolute_error: 0.3655 - val_loss: 0.0343 - val_mean_absolute_error: 0.1361
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 10ms/step - loss: 0.2618 - mean_absolute_error: 0.4327
2025-08-09 14:13:45.389167: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
2025-08-09 14:13:45.389514: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.2618 - mean_absolute_error: 0.4327 - val_loss: 0.0371 - val_mean_absolute_error: 0.1279
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1311 - mean_absolute_error: 0.2835 - val_loss: 0.0441 - val_mean_absolute_error: 0.1498
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3074 - mean_absolute_error: 0.4522 - val_loss: 0.0509 - val_mean_absolute_error: 0.1703
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2433 - mean_absolute_error: 0.3685 - val_loss: 0.0356 - val_mean_absolute_error: 0.1338
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2887 - mean_absolute_error: 0.3695 - val_loss: 0.0519 - val_mean_absolute_error: 0.1653

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 60ms/step - loss: 0.3700 - mean_absolute_error: 0.4620 - val_loss: 0.0182 - val_mean_absolute_error: 0.0633
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2329 - mean_absolute_error: 0.3331 - val_loss: 0.0657 - val_mean_absolute_error: 0.2279
Validation losses: [43.14220428466797, 1.1463972330093384, 0.13140827417373657, 0.05194620043039322, 0.06568381935358047]
HPS: {'player_emb_dim': 16, 'dense_units': 16, 'dense_units_2': 16, 'learning_rate': 0.01, 'dropout_rate': 0.1}. MSE during RandomSearch: 4.94112491607666. Starting evaluation across all k folds...
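The per-fold validation losses above span three orders of magnitude (one fold sits at ~43 while the others are below 1.2), so a plain mean is dominated by the outlier fold. A minimal sketch of the sanity check this suggests, using only the numbers printed above (variable names here are illustrative, not from the chapter's code):

```python
import statistics

# Per-fold validation losses, copied from the log output above.
fold_val_losses = [
    43.14220428466797,
    1.1463972330093384,
    0.13140827417373657,
    0.05194620043039322,
    0.06568381935358047,
]

# The mean is pulled up by the single outlier fold; the median is
# a more robust summary of typical fold performance.
mean_loss = statistics.mean(fold_val_losses)
median_loss = statistics.median(fold_val_losses)
print(f"mean={mean_loss:.3f}, median={median_loss:.3f}")
# prints: mean=8.908, median=0.131
```

Comparing the two makes it clear that the aggregate MSE reported by the search reflects one badly-behaved split rather than the model's typical fit.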

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 4.0343 - mean_absolute_error: 1.5195 - val_loss: 34.5748 - val_mean_absolute_error: 4.5300
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.1913 - mean_absolute_error: 0.3377 - val_loss: 34.7215 - val_mean_absolute_error: 4.3703

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 57ms/step - loss: 9.3987 - mean_absolute_error: 1.3628 - val_loss: 0.2008 - val_mean_absolute_error: 0.4126
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.8401 - mean_absolute_error: 0.9491 - val_loss: 11.0744 - val_mean_absolute_error: 2.4963

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 57ms/step - loss: 6.1889 - mean_absolute_error: 1.4318 - val_loss: 0.0419 - val_mean_absolute_error: 0.1479
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.1162 - mean_absolute_error: 0.6661 - val_loss: 0.4362 - val_mean_absolute_error: 0.4953
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.7767 - mean_absolute_error: 0.5638 - val_loss: 0.4299 - val_mean_absolute_error: 0.4957
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.7016 - mean_absolute_error: 0.5464 - val_loss: 0.4244 - val_mean_absolute_error: 0.5013
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.6307 - mean_absolute_error: 0.5390 - val_loss: 0.4142 - val_mean_absolute_error: 0.5129
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 34ms/step - loss: 0.4437 - mean_absolute_error: 0.4819 - val_loss: 0.4026 - val_mean_absolute_error: 0.5168
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.9097 - mean_absolute_error: 0.7355 - val_loss: 0.4054 - val_mean_absolute_error: 0.5191
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 76ms/step - loss: 1.2659 - mean_absolute_error: 0.7921 - val_loss: 0.3929 - val_mean_absolute_error: 0.5047
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.7816 - mean_absolute_error: 0.5826 - val_loss: 0.3988 - val_mean_absolute_error: 0.5061
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 34ms/step - loss: 0.1716 - mean_absolute_error: 0.3197 - val_loss: 0.4000 - val_mean_absolute_error: 0.5137
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3339 - mean_absolute_error: 0.4295 - val_loss: 0.3878 - val_mean_absolute_error: 0.5122
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.0624 - mean_absolute_error: 0.7461 - val_loss: 0.4032 - val_mean_absolute_error: 0.5225

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 58ms/step - loss: 0.3535 - mean_absolute_error: 0.4698 - val_loss: 0.0787 - val_mean_absolute_error: 0.2363
[... epochs 2-49 truncated ...]
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 1.1386 - mean_absolute_error: 0.6674 - val_loss: 0.3939 - val_mean_absolute_error: 0.5490

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 56ms/step - loss: 0.3168 - mean_absolute_error: 0.4253 - val_loss: 0.1104 - val_mean_absolute_error: 0.3213
[... epochs 2-49 truncated ...]
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2086 - mean_absolute_error: 0.3019 - val_loss: 0.9815 - val_mean_absolute_error: 0.7898
Validation losses: [34.721492767333984, 11.074376106262207, 0.4031953811645508, 0.39390820264816284, 0.9814999103546143]
HPS: {'player_emb_dim': 32, 'dense_units': 112, 'dense_units_2': 80, 'learning_rate': 0.01, 'dropout_rate': 0.1} Avg. across folds score(MSE): 7.982948251068592
HPS: {'player_emb_dim': 32, 'dense_units': 32, 'dense_units_2': 48, 'learning_rate': 0.01, 'dropout_rate': 0.1} Avg. across folds score(MSE): 9.579944050312042
HPS: {'player_emb_dim': 32, 'dense_units': 32, 'dense_units_2': 80, 'learning_rate': 0.0001, 'dropout_rate': 0.1} Avg. across folds score(MSE): 8.578448712825775
HPS: {'player_emb_dim': 8, 'dense_units': 96, 'dense_units_2': 80, 'learning_rate': 0.001, 'dropout_rate': 0.1} Avg. across folds score(MSE): 8.907527962327004
HPS: {'player_emb_dim': 16, 'dense_units': 16, 'dense_units_2': 16, 'learning_rate': 0.01, 'dropout_rate': 0.1} Avg. across folds score(MSE): 9.514894473552705
HPS: {'player_emb_dim': 32, 'dense_units': 32, 'dense_units_2': 48, 'learning_rate': 0.01, 'dropout_rate': 0.1}. Avg MSE: 9.579944050312042.
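Each configuration's score is simply the arithmetic mean of its per-fold validation losses. As a sketch, the five fold losses printed above average to the 9.5149 score reported for one of the trials:

```python
# Per-fold validation losses, copied from the printout above
val_losses = [34.721492767333984, 11.074376106262207, 0.4031953811645508,
              0.39390820264816284, 0.9814999103546143]

avg_mse = sum(val_losses) / len(val_losses)
print(f"Avg. across folds score(MSE): {avg_mse}")  # ≈ 9.5149
```

Note how a single badly-diverged fold (here the 34.7) can dominate the average, which is worth keeping in mind with this little data per fold.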
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 11.9505 - mean_absolute_error: 2.1858
[... epochs 2-49 truncated ...]
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - loss: 1.3821 - mean_absolute_error: 0.7463

Visualize real players#

marker_labels = [str(name) for name in team_members_with_ids.values()]  # player names as plot labels
fig, _, _ = analyze_players_embeddings(model_real, player_strengths_estimates, marker_labels)
fig
player_strengths.shape: (31,)
embeddings_nd[:, 0].shape : (31,)
Embeddings shape: (31, 16)
Dimension 1 correlation with base strengths: r = 0.2366, p-value = 0.2
Dimension 2 correlation with base strengths: r = -0.0082, p-value = 0.9651
Dimension 3 correlation with base strengths: r = -0.1194, p-value = 0.5223
Dimension 4 correlation with base strengths: r = 0.0379, p-value = 0.8394
Dimension 5 correlation with base strengths: r = 0.0772, p-value = 0.6798
Dimension 6 correlation with base strengths: r = 0.1129, p-value = 0.5453
Dimension 7 correlation with base strengths: r = -0.0807, p-value = 0.6659
Dimension 8 correlation with base strengths: r = -0.0719, p-value = 0.7008
Dimension 9 correlation with base strengths: r = -0.0092, p-value = 0.9606
Dimension 10 correlation with base strengths: r = 0.0638, p-value = 0.7331
Dimension 11 correlation with base strengths: r = -0.0245, p-value = 0.896
Dimension 12 correlation with base strengths: r = 0.0135, p-value = 0.9425
Dimension 13 correlation with base strengths: r = 0.0525, p-value = 0.779
Dimension 14 correlation with base strengths: r = 0.1772, p-value = 0.3404
Dimension 15 correlation with base strengths: r = 0.1700, p-value = 0.3605
Dimension 16 correlation with base strengths: r = -0.0344, p-value = 0.8542
Average absolute correlation across 16 components: 0.0806
player_strengths.shape: (31,)
embeddings_nd[:, 0].shape : (31,)
Embeddings shape: (31, 3)
Dimension 1 correlation with base strengths: r = -0.0060, p-value = 0.9743
Dimension 2 correlation with base strengths: r = 0.0176, p-value = 0.925
Dimension 3 correlation with base strengths: r = -0.0591, p-value = 0.7523
Average absolute correlation across 3 components: 0.0276
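The per-dimension readout above is a plain Pearson correlation between each embedding component and the base strength estimates. A minimal sketch of that computation on synthetic data (the array names and sizes mirror the printout; this is not the `analyze_players_embeddings` code itself):

```python
import numpy as np

# Synthetic stand-ins: 31 players, 16-d embeddings (shapes match the log above)
rng = np.random.default_rng(0)
player_strengths = rng.normal(size=31)      # hypothetical base strengths
embeddings_nd = rng.normal(size=(31, 16))   # hypothetical learned embeddings

# Pearson r of each embedding dimension against the strengths
correlations = [np.corrcoef(embeddings_nd[:, d], player_strengths)[0, 1]
                for d in range(embeddings_nd.shape[1])]
for d, r in enumerate(correlations, start=1):
    print(f"Dimension {d} correlation with base strengths: r = {r:.4f}")
print(f"Average absolute correlation: {np.mean(np.abs(correlations)):.4f}")
```

With random data, as with our real embeddings, no single dimension correlates strongly, so the low average absolute correlation by itself does not prove the embeddings are uninformative, only that no axis aligns linearly with strength.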

Using version with interaction for real data#

from tensorflow.keras.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)

# Step-decay learning rate schedule: halve the rate every 10 epochs
def lr_schedule(epoch, lr):
    drop_rate = 0.5
    epochs_drop = 10
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

# Instantiate the schedule callback
lr_scheduler = tf.keras.callbacks.LearningRateScheduler(lr_schedule)

# Alternatively, adaptively reduce the learning rate when val_loss plateaus
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=5, min_lr=1e-6)

es_callbacks = [lr_scheduler, reduce_lr, early_stop]


all_best_hps_inter_real = hyperparameter_search(build_model_inter, max_trials=10, callbacks=es_callbacks)
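As a quick sanity check on the step decay defined above (assuming the 0.01 starting rate used in the search), over a 50-epoch run the schedule steps through five plateaus:

```python
# Standalone check of the step-decay schedule: halve every 10 epochs
def lr_schedule(epoch, lr):
    drop_rate = 0.5
    epochs_drop = 10
    if epoch > 0 and epoch % epochs_drop == 0:
        return lr * drop_rate
    return lr

lr, rates = 0.01, []
for epoch in range(50):
    lr = lr_schedule(epoch, lr)   # Keras feeds back the current lr each epoch
    rates.append(lr)
print(sorted(set(rates), reverse=True))  # [0.01, 0.005, 0.0025, 0.00125, 0.000625]
```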
teamA_data shape: (19, 9)
teamB_data shape: (19, 9)
outcomes shape: (19,)

FOLD 1
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
/opt/anaconda3/envs/footballman/lib/python3.12/site-packages/keras/src/layers/layer.py:939: UserWarning:

Layer 'teamA_pairwise' (of type Lambda) was passed an input with a mask attached to it. However, this layer does not support masking and will therefore destroy the mask information. Downstream layers will not see the mask.

/opt/anaconda3/envs/footballman/lib/python3.12/site-packages/keras/src/layers/layer.py:939: UserWarning:

Layer 'teamB_pairwise' (of type Lambda) was passed an input with a mask attached to it. However, this layer does not support masking and will therefore destroy the mask information. Downstream layers will not see the mask.
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 128
  learning_rate: 0.001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 48
  learning_rate: 0.001
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
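The shapes printed for this trial suggest how the team vectors are assembled; the following is an inference from the logged sizes, not the notebook's confirmed code. Each team's attention-pooled embedding (32) is concatenated with its pairwise vector (96 here, since `interaction_scale: 3` times `player_emb_dim: 32`) into a 128-dim team vector, and the 384-dim match input stacks three 128-dim vectors, plausibly team A, team B, and a matchup vector such as their difference:

```python
import tensorflow as tf

# Hypothetical reconstruction of the shape arithmetic in the trace above:
# team vector = pooled embedding (32) ++ pairwise pooled (96) = 128,
# match input = concat(teamA, teamB, matchup) = 3 * 128 = 384.
teamA_vec = tf.random.normal([4, 32])
teamA_pair = tf.random.normal([4, 96])
teamA_combined = tf.concat([teamA_vec, teamA_pair], axis=-1)   # (4, 128)

teamB_combined = tf.random.normal([4, 128])
matchup = teamA_combined - teamB_combined                      # (4, 128)
match_input = tf.concat(
    [teamA_combined, teamB_combined, matchup], axis=-1)
print(match_input.shape)  # (4, 384)
```

Whether the matchup vector is a difference, a product, or something else cannot be read off the shapes alone; only the 3 × 128 = 384 arithmetic is fixed by the log.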
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 128
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
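The shape trace above outlines the pairwise step: build upper-triangular pair indices over the 9 roster slots, form an interaction vector per pair, zero out pairs involving padded slots, and mean-pool over the valid pairs. As a rough NumPy illustration only (the elementwise-product interaction and the function name are assumptions, not the notebook's actual layer), this can be sketched as:

```python
import numpy as np

def pairwise_pool(embeds, mask):
    """Masked mean over upper-triangular player-pair interactions.

    embeds: (batch, 9, d) player embeddings; mask: (batch, 9) slot validity.
    The pair interaction used here (elementwise product) is an assumption
    for illustration; the real model may combine pairs differently.
    """
    n = embeds.shape[1]
    i, j = np.triu_indices(n, k=1)                 # pair indices, each (36,)
    inter = embeds[:, i, :] * embeds[:, j, :]      # (batch, 36, d)
    valid = (mask[:, i] * mask[:, j])[..., None]   # (batch, 36, 1)
    counts = np.maximum(valid.sum(axis=1), 1.0)    # avoid divide-by-zero
    return (inter * valid).sum(axis=1) / counts    # (batch, d)

emb = np.random.rand(2, 9, 64)
m = np.ones((2, 9))
m[1, 7:] = 0.0                                     # second sample has 7 players
print(pairwise_pool(emb, m).shape)                 # (2, 64)
```

This matches the traced shapes: 36 candidate pairs from 9 slots, a mask-expanded interaction tensor, per-sample valid counts, and a pooled `(batch, 64)` output.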
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 128
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.2
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 96
  learning_rate: 0.01
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 64
  learning_rate: 0.001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
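The trace above suggests the pairwise branch enumerates all unordered player pairs within a 9-slot roster using an upper-triangular mask. A minimal NumPy sketch of that index construction, under the assumption that the logged names (`idx`, `i`, `upper_tri_mask`, `i_pairs`) correspond to these steps and that the roster size of 9 comes from the traced shapes:

```python
import numpy as np

ROSTER_SIZE = 9  # maximum players per team, as in the traced (9, 9) shapes

# Index grid: every row is 0..8, matching the printed "I matrix".
idx = np.arange(ROSTER_SIZE)
i = np.tile(idx, (ROSTER_SIZE, 1))  # shape (9, 9)

# Keep only unordered pairs (row < col) via an upper-triangular mask
# that excludes the diagonal (k=1), so no player is paired with itself.
upper_tri_mask = np.triu(np.ones((ROSTER_SIZE, ROSTER_SIZE), dtype=bool), k=1)
i_pairs, j_pairs = np.nonzero(upper_tri_mask)

# 9 choose 2 = 36 unordered pairs per team
print(i_pairs.shape)  # (36,)
```

In the traced graph the pair count shows up as a symbolic `(None,)` shape because the batch is flattened, but the static per-team count is 9 * 8 / 2 = 36.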
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 96
  learning_rate: 0.001
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
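The concatenation arithmetic in the trace is internally consistent: each team vector joins the 32-dim attention-pooled embedding with the 96-dim pooled pairwise summary (32 + 96 = 128), and the 384-dim match input matches three stacked 128-dim vectors. A hedged NumPy sketch of that assembly; the choice of a team-difference vector as the third component is an assumption, not something the log confirms:

```python
import numpy as np

BATCH = 2  # stand-in for the symbolic None batch dimension

team_a_pooled = np.zeros((BATCH, 32))    # teamA_vector_pooled
team_a_pairwise = np.zeros((BATCH, 96))  # teamA_pairwise
team_a = np.concatenate([team_a_pooled, team_a_pairwise], axis=-1)  # (BATCH, 128)

team_b = np.zeros((BATCH, 128))          # teamB_combined, built the same way
matchup = team_a - team_b                # assumed matchup_vector

# Three 128-dim vectors stacked: 3 * 128 = 384, matching match_input.shape
match_input = np.concatenate([team_a, team_b, matchup], axis=-1)
```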
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 112
  learning_rate: 0.01
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.4
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
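The logged shapes are internally consistent. As a quick sanity check of the dimension bookkeeping (this is an interpretation of the trace, not the model code itself: it assumes the 96-dim pairwise summary is `interaction_scale * player_emb_dim`, that each team vector concatenates the 32-dim pooled attention output with that summary, and that the 384-dim match input concatenates both team vectors with the 128-dim matchup vector):

```python
# Dimension bookkeeping implied by the logged shapes (assumptions noted above).
player_emb_dim = 32
interaction_scale = 3
pairwise_dim = interaction_scale * player_emb_dim  # -> 96, teamA_pairwise
team_combined = player_emb_dim + pairwise_dim      # -> 128, teamA_combined
match_input = 3 * team_combined                    # teamA + teamB + matchup -> 384
print(pairwise_dim, team_combined, match_input)    # 96 128 384
```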
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
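The pairwise trace above (and the repeated "I matrix" printout) can be reproduced with a small NumPy sketch. Assuming a maximum team size of 9 and 32-dim player embeddings, broadcasting `idx` gives the printed index matrix, a strict upper-triangular mask selects the 36 unique player pairs, and the 96-dim interaction could be the concatenation of the two embeddings with their elementwise product (matching `interaction_scale: 3`). This is an educated reconstruction of what the layer computes, not the author's exact code:

```python
import numpy as np

MAX_PLAYERS, EMB_DIM = 9, 32  # assumed from the logged shapes

idx = np.arange(MAX_PLAYERS)                          # idx shape: (9,)
i = np.broadcast_to(idx, (MAX_PLAYERS, MAX_PLAYERS))  # the "I matrix": every row is 0..8
j = i.T                                               # row indices
upper_tri_mask = i > j                                # strict upper triangle -> unique pairs
i_pairs, j_pairs = i[upper_tri_mask], j[upper_tri_mask]
assert len(i_pairs) == MAX_PLAYERS * (MAX_PLAYERS - 1) // 2  # 36 unique pairs

# Hypothetical 96-dim pairwise interaction (3 * EMB_DIM): concat(e_i, e_j, e_i * e_j).
emb = np.random.randn(MAX_PLAYERS, EMB_DIM)
interaction = np.concatenate(
    [emb[i_pairs], emb[j_pairs], emb[i_pairs] * emb[j_pairs]], axis=-1
)
print(interaction.shape)  # (36, 96)
```

In the traced Keras model the batch dimension is symbolic, which is why the same quantities print as `(None,)` and `(None, None, 96)` above.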
2025-08-09 16:32:03.724083: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 48
  learning_rate: 0.0001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
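The pairwise trace above starts from `idx shape: (9,)` and a `(9, 9)` index grid (the repeated `I matrix` rows in the log), then keeps only the strict upper triangle so each unordered pair of teammates is counted once. A small NumPy sketch of that index bookkeeping (variable names chosen to mirror the log; this is an illustration, not the notebook's code):

```python
import numpy as np

n_players = 9  # player slots per team, as in the logged shapes

idx = np.arange(n_players)                       # idx shape: (9,)
i = np.broadcast_to(idx, (n_players, n_players)) # the printed "I matrix": each row is 0..8
j = i.T                                          # row indices

# Strict upper triangle: column index > row index, so each pair appears once.
upper_tri_mask = i > j                           # shape (9, 9)

i_pairs = j[upper_tri_mask]  # first member of each pair
j_pairs = i[upper_tri_mask]  # second member of each pair
# 9 players -> 9*8/2 = 36 unordered pairs
```

For a full squad of 9 slots this yields 36 pairs per team, and the pair-level interaction features are then masked and mean-pooled back to a fixed `(None, 96)` vector, so teams with fewer real players still produce a comparable summary.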
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 96
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.1
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
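The trace above shows per-player embeddings `(None, 9, 32)` being reduced to a single team vector `(None, 32)` alongside a slot mask `(None, 9)`. A minimal sketch of masked mean pooling that is consistent with those shapes (an assumption about the mechanism, not the actual `competition_manager` code; team sizes vary week to week, so padded slots must be excluded from the average):

```python
import numpy as np

# Hedged sketch: average 9 player-slot embeddings of width 32 into one team
# vector, using a 0/1 validity mask so padded slots do not dilute the mean.
rng = np.random.default_rng(0)
batch, slots, dim = 2, 9, 32
embeds = rng.normal(size=(batch, slots, dim))   # stand-in for teamA_embeds
mask = np.zeros((batch, slots))
mask[:, :6] = 1.0                               # e.g. only 6 real players listed

pooled = (embeds * mask[:, :, None]).sum(axis=1) / mask.sum(axis=1, keepdims=True)
print(pooled.shape)  # (2, 32) — the (None, 32) of teamA_vector_pooled
```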
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
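The `idx (9,)` / `i (9, 9)` / `upper_tri_mask (9, 9)` shapes suggest the pairwise layer enumerates every unordered pair of the 9 player slots before building interaction features. A NumPy sketch of that index bookkeeping (the variable names mirror the log; the exact broadcasting is an assumption, since the real implementation lives in `competition_manager`):

```python
import numpy as np

# Sketch (assumed, not the actual implementation): enumerate all unordered
# slot pairs (i, j) with i < j for a 9-slot squad, matching the logged
# shapes idx (9,), i (9, 9), upper_tri_mask (9, 9).
n = 9
idx = np.arange(n)                              # idx shape: (9,)
i = np.broadcast_to(idx[None, :], (n, n))       # every row is [0 1 2 ... 8]
upper_tri_mask = idx[None, :] > idx[:, None]    # True strictly above the diagonal
i_pairs = np.broadcast_to(idx[:, None], (n, n))[upper_tri_mask]  # row index per pair
j_pairs = i[upper_tri_mask]                     # column index per pair

print(len(i_pairs))  # 36 pairs, i.e. 9 choose 2
```

Each of the 36 pairs then yields one interaction vector, which is masked and mean-pooled down to the `pairwise:pooled` tensor in the trace.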
[... repeated "Starting pairwise interaction computation" shape traces, duplicated TensorFlow NodeDef warnings, and "I matrix" / "emb_i finished" blocks truncated ...]
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 128
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.3
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
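With 96-wide combined team vectors, `match_input.shape (None, 288)` is 3 × 96. One plausible reading of that width (an assumption, not confirmed by the source, and the difference term below is hypothetical):

```python
import numpy as np

# Hypothetical reading of match_input.shape (None, 288): three 96-wide
# vectors concatenated, e.g. [teamA_combined, teamB_combined, matchup_vector].
teamA_combined = np.ones((4, 96))
teamB_combined = np.zeros((4, 96))
matchup_vector = teamA_combined - teamB_combined   # assumed elementwise contrast
match_input = np.concatenate(
    [teamA_combined, teamB_combined, matchup_vector], axis=1
)
print(match_input.shape)  # (4, 288)
```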
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
[... duplicate pairwise interaction shape trace truncated ...]
/opt/anaconda3/envs/footballman/lib/python3.12/site-packages/keras/src/saving/saving_lib.py:757: UserWarning:

Skipping variable loading for optimizer 'adam', because it has 2 variables whereas the saved optimizer has 56 variables. 
[... repeated pairwise interaction shape traces and "I matrix" / "emb_i finished" blocks truncated ...]
1/1 ━━━━━━━━━━━━━━━━━━━━ 1s 544ms/step - loss: 27.7699 - mean_absolute_error: 4.0780
1: 27.76988410949707     2: 4.077999591827393  
27.76988410949707

FOLD 2
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
[... model shape trace (same dimensions as above) truncated ...]
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 96
  learning_rate: 0.01
  dropout_rate: 0.3
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.4
  interaction_scale: 2
[... model shape trace (same dimensions as above) truncated ...]
[... repeated "Starting pairwise interaction computation" shape traces, duplicated TensorFlow NodeDef warnings, and "I matrix" / "emb_i finished" blocks truncated ...]
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 128
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.2
  interaction_scale: 2
[... model shape trace (same dimensions as above) truncated ...]
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
2025-08-09 16:32:26.998043: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 80
  learning_rate: 0.001
  dropout_rate: 0.2
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.2
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 32
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.4
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
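The `interaction_masked`, `valid_counts` and `pooled` shapes in the traces indicate the pair features are mean-pooled only over valid (non-padded) pairs. A rough NumPy sketch of that masked pooling step, under the same shape assumptions; the helper name `masked_mean_pool` is ours:

```python
import numpy as np

def masked_mean_pool(interaction, valid_pairs_mask):
    # interaction: (batch, n_pairs, dim) pair features.
    # valid_pairs_mask: (batch, n_pairs) of 0/1 flags for real vs padded pairs.
    mask = valid_pairs_mask[..., None]                    # (batch, n_pairs, 1)
    interaction_masked = interaction * mask               # zero out padded pairs
    valid_counts = np.maximum(mask.sum(axis=1), 1.0)      # (batch, 1), guard against /0
    return interaction_masked.sum(axis=1) / valid_counts  # (batch, dim)
```

Dividing by the count of valid pairs rather than the fixed pair total keeps the pooled vector on a comparable scale across variable team sizes.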
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
2025-08-09 16:32:34.509053: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.4
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
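The printed tensor shapes above imply a simple concatenation arithmetic: each team tower concatenates its attention-pooled vector (32) with its pooled pairwise vector (128) into a 160-dim team vector, and the match input stacks both team vectors plus a 160-dim matchup vector into 480 features. The sketch below reconstructs only that arithmetic; the actual matchup operation is an assumption (here `|A - B|`, a common symmetric choice).

```python
# Hypothetical reconstruction of the shape arithmetic in the trace above.
# Variable names mirror the log; the matchup op itself is an assumption.
import numpy as np

B = 4                                       # batch size
teamA_pooled = np.random.rand(B, 32)        # attention-pooled team vector
teamA_pairwise = np.random.rand(B, 128)     # pooled pairwise interactions
teamA_combined = np.concatenate([teamA_pooled, teamA_pairwise], axis=1)  # (B, 160)

teamB_combined = np.random.rand(B, 160)     # same tower applied to team B
matchup_vector = np.abs(teamA_combined - teamB_combined)                 # (B, 160)
match_input = np.concatenate(
    [teamA_combined, teamB_combined, matchup_vector], axis=1)            # (B, 480)
print(match_input.shape)  # (4, 480)
```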
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 32
  learning_rate: 0.01
  dropout_rate: 0.3
  dropout_rate_2: 0.3
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 32
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
2025-08-09 16:32:47.684924: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 64
  learning_rate: 0.001
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
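The logged shapes above can be reproduced end-to-end with a small numpy sketch. This is an assumption-laden reconstruction, not the model code: the batch size of 4, the zero-filled tensors, and the subtraction-based `matchup` vector are illustrative choices; only the dimensions (`player_emb_dim=32`, 9 player slots, 64-dim pairwise pooling, 96-dim per-team vector, 288-dim match input) come from the log.

```python
import numpy as np

# Hypothetical walk-through of the logged tensor shapes.
batch, slots, emb_dim = 4, 9, 32

team_pooled = np.zeros((batch, emb_dim))        # attention-pooled player embeddings, (4, 32)
team_pairwise = np.zeros((batch, 2 * emb_dim))  # pooled pairwise interactions, (4, 64)

# teamX_combined.shape (None, 96): pooled vector concatenated with pairwise features
team_combined = np.concatenate([team_pooled, team_pairwise], axis=-1)

# matchup_vector.shape (None, 96): one plausible choice is team A minus team B
matchup = team_combined - team_combined

# match_input.shape (None, 288): both team vectors plus the matchup vector
match_input = np.concatenate([team_combined, team_combined, matchup], axis=-1)
```

With these dimensions, `match_input` ends up 3 × 96 = 288 wide, matching the log.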
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
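The pairwise debug trace above (upper-triangular index pairs over 9 slots, a validity mask, masked interactions, and a count-normalised pool) can be sketched in plain numpy. This is a hedged approximation of what the logged step appears to compute: the `np.triu_indices` pairing, concatenation as the interaction, and the division-by-zero guard are assumptions; the real layer operates on batched TensorFlow tensors.

```python
import numpy as np

def pairwise_pooled(embeddings, valid_mask):
    """Mean-pool concatenated embeddings over all unordered player pairs.

    embeddings: (team_size, emb_dim) player embeddings.
    valid_mask: (team_size,) bools, False for padding slots.
    Mirrors the logged shapes: 9 slots, 64-dim interaction = 2 * 32-dim.
    """
    n, d = embeddings.shape
    i_idx, j_idx = np.triu_indices(n, k=1)  # all pairs with i < j
    # Interaction feature for a pair: the two embeddings concatenated.
    interaction = np.concatenate([embeddings[i_idx], embeddings[j_idx]], axis=-1)
    # A pair counts only if both players are real (not padding).
    pair_valid = (valid_mask[i_idx] & valid_mask[j_idx]).astype(float)
    masked = interaction * pair_valid[:, None]
    counts = max(pair_valid.sum(), 1.0)  # guard against an all-padding team
    return masked.sum(axis=0) / counts   # (2 * emb_dim,)
```

Averaging over only the valid pairs keeps the pooled vector comparable across the variable team sizes discussed earlier, since padded slots contribute nothing to either the sum or the denominator.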
2025-08-09 16:32:54.590841: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 16
  learning_rate: 0.0001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 64
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.4
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
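The shape trace above is internally consistent with the printed hyper-parameters: pooled team vectors are `player_emb_dim = 32` wide, the pairwise summary is `interaction_scale * player_emb_dim = 64`, and the concatenations yield 96 and 288. A quick sanity check of that arithmetic (note: the concatenation scheme — pooled plus pairwise per team, then team A, team B, and a matchup vector side by side — is inferred from the logged shapes, not confirmed by the model code):

```python
# Sanity-check the tensor widths printed in the shape trace above.
# The concatenation scheme is inferred from the logged shapes.
player_emb_dim = 32
interaction_scale = 2

pairwise_dim = interaction_scale * player_emb_dim  # teamA_pairwise: 64
combined_dim = player_emb_dim + pairwise_dim       # teamA_combined: 32 + 64 = 96
match_input_dim = 3 * combined_dim                 # team A + team B + matchup: 288

print(pairwise_dim, combined_dim, match_input_dim)
```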
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
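The index bookkeeping logged in the trace above can be sketched in NumPy. This is a hypothetical re-creation based only on the printed shapes, assuming a fixed maximum team size of 9: `idx` has shape `(9,)`, tiling it produces the "I matrix" of shape `(9, 9)`, and a strict upper-triangular mask selects one entry per unordered player pair.

```python
import numpy as np

max_team_size = 9
idx = np.arange(max_team_size)            # idx shape: (9,)
i = np.tile(idx, (max_team_size, 1))      # the "I matrix": every row is 0..8, shape (9, 9)
j = i.T                                   # row indices, shape (9, 9)
upper_tri_mask = j < i                    # True strictly above the diagonal, shape (9, 9)

# One entry per unordered player pair: C(9, 2) = 36 pairs.
i_pairs = i[upper_tri_mask]               # column index of each pair, shape (36,)
j_pairs = j[upper_tri_mask]               # row index of each pair, shape (36,)
```

In the model the pair count shows up as a dynamic dimension (`i_pairs shape: (None,)`), because the graph is traced with an unknown batch and the valid-pair mask handles teams smaller than 9; with all 9 slots filled the static count is 36.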
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.4
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
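The shape printout above follows a consistent pattern: the pairwise feature width appears to be `player_emb_dim * interaction_scale`, the per-team combined vector concatenates the pooled embedding with the pairwise features, and `match_input` stacks the two team vectors plus the matchup vector. A minimal bookkeeping sketch, assuming exactly that arithmetic (the helper name `matchup_dims` is ours, not from the model code):

```python
# Hypothetical helper reconstructing the dimension arithmetic implied by the log.
def matchup_dims(player_emb_dim, interaction_scale):
    pairwise = player_emb_dim * interaction_scale   # teamA_pairwise width
    combined = player_emb_dim + pairwise            # pooled embedding + pairwise features
    match_input = 3 * combined                      # teamA, teamB, matchup_vector concatenated
    return pairwise, combined, match_input

# player_emb_dim=32, interaction_scale=4 reproduces the shapes printed above:
print(matchup_dims(32, 4))  # (128, 160, 480)
```

The same formula reproduces the other configurations in this log, e.g. `interaction_scale=2` gives `(64, 96, 288)` and `interaction_scale=3` gives `(96, 128, 384)`.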
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
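The `idx`, `i`, and `upper_tri_mask` shapes above can be reproduced with a small NumPy sketch of the pair-index construction (an assumption based on the printed shapes, not the model's actual TensorFlow code; inside the graph `i_pairs` shows as `(None,)` because the shape is only resolved at runtime):

```python
import numpy as np

TEAM_SIZE = 9  # matches the (9, ...) shapes in the debug output

# Player slot indices for one team.
idx = np.arange(TEAM_SIZE)                       # shape (9,)

# Broadcast to a grid: every row repeats 0..8 -- the "I matrix" printed above.
i = np.tile(idx, (TEAM_SIZE, 1))                 # shape (9, 9)
j = i.T

# Keep each unordered pair (a, b) with a < b exactly once: 9 * 8 / 2 = 36 pairs.
upper_tri_mask = np.triu(np.ones((TEAM_SIZE, TEAM_SIZE), dtype=bool), k=1)
i_pairs = i[upper_tri_mask]                      # shape (36,)
j_pairs = j[upper_tri_mask]

print(i.shape, upper_tri_mask.shape, i_pairs.shape)
```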
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 475ms/step - loss: 0.7098 - mean_absolute_error: 0.7726
1: 0.70982825756073     2: 0.7725679278373718  
0.70982825756073

FOLD 3
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 80
  learning_rate: 0.001
  dropout_rate: 0.4
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 32
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.4
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 48
  learning_rate: 0.001
  dropout_rate: 0.1
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.4
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
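The shape trace above is consistent with simple dimension bookkeeping: the pooled team vector (`player_emb_dim = 32`) is concatenated with the pairwise interaction vector (`player_emb_dim * interaction_scale`), and the final match input stacks three such vectors. A minimal sketch of that arithmetic, assuming this concatenation scheme (the factor of 3 is an inference from 480 = 3 × 160, not confirmed by the trace itself):

```python
# Hedged reconstruction of the dimension bookkeeping printed in the trace
# (names come from the log; the concatenation order is an assumption).
player_emb_dim = 32
interaction_scale = 4

pairwise_dim = player_emb_dim * interaction_scale  # 128, matches teamA_pairwise.shape
combined_dim = player_emb_dim + pairwise_dim       # 160, matches teamA_combined.shape
match_input_dim = 3 * combined_dim                 # 480: teamA, teamB and matchup vectors

print(pairwise_dim, combined_dim, match_input_dim)  # 128 160 480
```

The same arithmetic explains the later trials: with `interaction_scale: 2` the pairwise vector is 64-dimensional, the combined vector 96, and the match input 288.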
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
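The pairwise trace prints a `(9,)` index vector, a `(9, 9)` meshgrid whose rows are all `0…8` (the repeated "I matrix" in the log), and a `(9, 9)` upper-triangular mask; together these are the standard way to enumerate each unordered pair of the 9 roster slots exactly once. A minimal NumPy reconstruction under that assumption (the function name is hypothetical):

```python
import numpy as np

def pairwise_indices(team_size=9):
    # idx shape: (9,) -- one slot per player
    idx = np.arange(team_size)
    # i shape: (9, 9) -- every row is 0..8; this is the "I matrix" in the log
    i = np.tile(idx, (team_size, 1))
    # upper_tri_mask shape: (9, 9) -- True strictly above the diagonal,
    # so each unordered pair (a, b) with a < b is selected exactly once
    upper_tri_mask = np.triu(np.ones((team_size, team_size), dtype=bool), k=1)
    i_pairs = i.T[upper_tri_mask]   # first-player index of each pair
    j_pairs = i[upper_tri_mask]     # second-player index of each pair
    return i_pairs, j_pairs

i_pairs, j_pairs = pairwise_indices()
print(len(i_pairs))  # C(9, 2) = 36 candidate pairs per team
```

In the model itself the trace suggests these static indices are then tiled across the batch (`i_batch`, `j_batch`), the per-pair interaction features are masked by which slots hold real players, and the masked features are mean-pooled over the valid pairs to give the fixed-size `pooled` vector.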
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 48
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 32
  learning_rate: 0.001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 32
  learning_rate: 0.001
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
2025-08-09 16:33:35.958161: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 48
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.2
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
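The logged tensor shapes suggest how the per-team representations are assembled: the 32-dim attention-pooled team vector is concatenated with the 128-dim pooled pairwise vector into a 160-dim team representation, and the 480-dim match input stacks both teams together with a 160-dim matchup vector. The snippet below is a hypothetical reconstruction consistent with those shapes (here the matchup vector is taken as the difference of the two teams, which is an assumption, not something the log confirms):

```python
import tensorflow as tf

# Placeholder tensors standing in for the model's intermediate outputs.
teamA_vector_pooled = tf.zeros((1, 32))   # attention-pooled team vector
teamA_pairwise = tf.zeros((1, 128))       # pooled pairwise interaction vector
teamB_combined = tf.zeros((1, 160))       # same construction for team B

teamA_combined = tf.concat([teamA_vector_pooled, teamA_pairwise], axis=-1)  # (1, 160)
matchup_vector = teamA_combined - teamB_combined                            # (1, 160)
match_input = tf.concat(
    [teamA_combined, teamB_combined, matchup_vector], axis=-1)              # (1, 480)
```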
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 16
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 80
  learning_rate: 0.001
  dropout_rate: 0.4
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 112
  dense_units_2: 80
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 16
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 404ms/step - loss: 2.7987 - mean_absolute_error: 1.2325
1: 2.7987241744995117     2: 1.2325010299682617  
2.7987241744995117

FOLD 4
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 96
  dense_units_2: 128
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.2
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 96
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.3
  dropout_rate_inter: 0.2
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.3
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
2025-08-09 16:33:59.864288: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 112
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
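The logged shapes above suggest what the pairwise step does: enumerate the upper-triangular player pairs of a 9-slot team, project each concatenated pair of embeddings, zero out pairs involving padded slots, and mean-pool over the valid pairs. A minimal NumPy sketch of that idea (function name, the `W` projection, and ReLU are assumptions for illustration, not the chapter's actual layer code):

```python
import numpy as np

def pairwise_pool(embeds, mask, W):
    # embeds: (batch, 9, d) player embeddings; mask: (batch, 9) with 1.0 for
    # real players, 0.0 for padding; W: (2*d, out_dim) assumed projection.
    n = embeds.shape[1]
    i, j = np.triu_indices(n, k=1)                 # the 36 upper-triangular pairs
    pair_feats = np.concatenate([embeds[:, i], embeds[:, j]], axis=-1)
    interaction = np.maximum(pair_feats @ W, 0.0)  # (batch, 36, out_dim), ReLU
    valid = (mask[:, i] * mask[:, j])[..., None]   # pair valid only if both present
    counts = np.maximum(valid.sum(axis=1), 1.0)    # avoid division by zero
    return (interaction * valid).sum(axis=1) / counts  # (batch, out_dim)
```

With `out_dim = 128` this reproduces the `pairwise:pooled (None, 128)` shape in the trace; masked-out players contribute nothing to the pooled vector because their pairs are zeroed before the mean.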
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 80
  learning_rate: 0.001
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.3
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 64
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.4
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 128
  learning_rate: 0.001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
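The printed shapes appear internally consistent. Assuming (an inference from the logs above, not stated explicitly in the source) that the pooled pairwise vector has width `player_emb_dim * interaction_scale`, that each team's combined vector concatenates the pooled embedding with the pooled pairwise vector, and that the match input concatenates the team A vector, the team B vector, and the matchup vector, the logged dimensions check out:

```python
# Hypothetical dimension check inferred from the logged shapes above.
player_emb_dim = 32      # from the hyperparameter dump
interaction_scale = 4    # from the hyperparameter dump

pairwise_dim = player_emb_dim * interaction_scale   # 128, matches teamA_pairwise
combined_dim = player_emb_dim + pairwise_dim        # 160, matches teamA_combined
match_input_dim = 3 * combined_dim                  # 480, matches match_input

print(pairwise_dim, combined_dim, match_input_dim)  # → 128 160 480
```

The same arithmetic matches the second trial further down (32 * 2 = 64, 32 + 64 = 96, 3 * 96 = 288), which supports the inference.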
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
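The `idx`, `i`, and `upper_tri_mask` shapes logged above suggest the standard upper-triangle trick for enumerating the unordered player pairs within a team. A minimal NumPy sketch of that indexing step (variable names mirror the log; the actual layer code is not shown in this section):

```python
import numpy as np

n = 9                                    # squad slots per team, as in the logs
idx = np.arange(n)                       # idx shape: (9,)
i = np.broadcast_to(idx, (n, n))         # the repeated-row "I matrix" from the logs
j = i.T                                  # row index per cell

# Strict upper triangle keeps each unordered pair exactly once (row < col).
upper_tri_mask = np.triu(np.ones((n, n), dtype=bool), k=1)   # shape (9, 9)

i_pairs = j[upper_tri_mask]              # first member of each pair
j_pairs = i[upper_tri_mask]              # second member of each pair
# 9 players yield 9 * 8 / 2 = 36 unordered pairs
print(len(i_pairs), i_pairs[:3], j_pairs[:3])  # → 36 [0 0 0] [1 2 3]
```

In the model these index pairs would then gather the corresponding player embeddings so each pair's interaction can be computed and masked-mean-pooled, as the `pairwise:*` lines in the log indicate.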
... (identical pairwise-interaction debug block repeated; truncated) ...
2025-08-09 16:34:19.930401: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
... (duplicate NodeDef warning and further repeated pairwise-interaction debug blocks truncated) ...
... (further repeated "I matrix" / "emb_i finished" and pairwise-interaction debug blocks truncated) ...
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 48
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
2025-08-09 16:34:25.624913: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 112
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.2
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 48
  learning_rate: 0.0001
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 96
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.2
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 446ms/step - loss: 2.2232 - mean_absolute_error: 1.0647
1: 2.2231674194335938     2: 1.0646657943725586  
2.2231674194335938

FOLD 5
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 32
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (16, None)
j_batch shape: (16, None)
batch_size shape: ()
i_indices shape: (16, None)
pairwise:interaction (16, None, 64)
pairwise:valid_pairs_mask_expanded (16, None, 1)
pairwise:interaction_masked (16, None, 64)
pairwise:valid_counts (16, 1)
pairwise:pooled (16, 64)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 16
  learning_rate: 0.01
  dropout_rate: 0.1
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
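The combined shapes above are consistent with plain concatenation: a 32-dim attention-pooled team vector joined with a 96-dim pairwise summary gives the 128-dim `teamA_combined`, and the two team vectors plus a 128-dim matchup vector give the 384-dim `match_input`. A sketch under those assumptions (using an elementwise difference for `matchup_vector`, which is a guess; the log only fixes its width):

```python
import numpy as np

batch = 4
teamA_pooled = np.zeros((batch, 32))    # attention-pooled team vector
teamA_pairwise = np.zeros((batch, 96))  # pooled pairwise interactions
teamA_combined = np.concatenate([teamA_pooled, teamA_pairwise], axis=-1)  # (batch, 128)

teamB_combined = np.zeros((batch, 128))
# Assumption: matchup_vector as an elementwise difference of the two teams;
# the logged shape (None, 128) is also consistent with a product or a dense layer.
matchup_vector = teamA_combined - teamB_combined
match_input = np.concatenate(
    [teamA_combined, teamB_combined, matchup_vector], axis=-1
)                                       # (batch, 384)
```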
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (16, None)
j_batch shape: (16, None)
batch_size shape: ()
i_indices shape: (16, None)
pairwise:interaction (16, None, 96)
pairwise:valid_pairs_mask_expanded (16, None, 1)
pairwise:interaction_masked (16, None, 96)
pairwise:valid_counts (16, 1)
pairwise:pooled (16, 96)
======================================================================================================================================================
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
2025-08-09 16:34:42.865130: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
[... repeated "I matrix" / "emb_i finished" training traces trimmed ...]
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 80
  learning_rate: 0.01
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
[... pairwise-interaction and embedding traces trimmed (identical in shape to the previous trial) ...]
  player_emb_dim: 32
  dense_units: 128
  dense_units_2: 80
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (16, None)
j_batch shape: (16, None)
batch_size shape: ()
i_indices shape: (16, None)
pairwise:interaction (16, None, 128)
pairwise:valid_pairs_mask_expanded (16, None, 1)
pairwise:interaction_masked (16, None, 128)
pairwise:valid_counts (16, 1)
pairwise:pooled (16, 128)
======================================================================================================================================================
[... repeated pairwise-interaction traces trimmed (including the dynamic-batch variant) ...]
[... repeated "I matrix" / "emb_i finished" training traces trimmed ...]
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 64
  dense_units_2: 80
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.1
  interaction_scale: 3
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 96)
teamA_combined.shape (None, 128)
teamB_combined.shape (None, 128)
matchup_vector.shape (None, 128)
match_input.shape (None, 384)
Outcome
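The shape trace above implies how the match input is assembled: each team's 32-dim pooled embedding is concatenated with its 96-dim pairwise vector to give a 128-dim combined vector, and the 384-dim match input stacks three 128-dim vectors (team A, team B, and a matchup vector). A minimal NumPy sketch of that assembly — note the elementwise-difference matchup operation is a hypothetical choice consistent with the logged shapes, not confirmed by the source:

```python
import numpy as np

def build_match_input(teamA_combined, teamB_combined):
    """Stack the two 128-dim team vectors with a 128-dim matchup vector.

    Assumption: the matchup vector is the elementwise difference of the
    two team vectors (any 128-dim combination would match the logged shapes).
    """
    matchup = teamA_combined - teamB_combined                     # (batch, 128)
    return np.concatenate([teamA_combined, teamB_combined, matchup], axis=-1)

a = np.zeros((4, 128))
b = np.ones((4, 128))
x = build_match_input(a, b)
print(x.shape)   # (4, 384), matching match_input.shape (None, 384)
```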
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (16, None)
j_batch shape: (16, None)
batch_size shape: ()
i_indices shape: (16, None)
pairwise:interaction (16, None, 96)
pairwise:valid_pairs_mask_expanded (16, None, 1)
pairwise:interaction_masked (16, None, 96)
pairwise:valid_counts (16, 1)
pairwise:pooled (16, 96)
======================================================================================================================================================
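The trace above can be reconstructed into a concrete computation: for 9 player slots, take the upper-triangular index pairs i < j, build a 96-dim feature per pair (three 32-dim pieces: emb_i, emb_j, and their product), zero out pairs involving padded slots, and mean-pool over the valid pairs. A sketch in NumPy, under the assumption that the 96 dims come from concatenating [emb_i, emb_j, emb_i * emb_j]:

```python
import numpy as np

def pairwise_pool(team_emb, mask):
    """Mean-pool concatenated pairwise features over valid player pairs.

    team_emb: (batch, n_players, dim) player embeddings
    mask:     (batch, n_players), 1.0 for real players, 0.0 for padding
    Returns:  (batch, 3 * dim) pooled pairwise interaction vector
    """
    batch, n, dim = team_emb.shape
    # Upper-triangular pairs i < j, mirroring the logged (9, 9) upper_tri_mask.
    i_idx, j_idx = np.triu_indices(n, k=1)                  # 36 pairs for n=9
    emb_i = team_emb[:, i_idx, :]                           # (batch, 36, dim)
    emb_j = team_emb[:, j_idx, :]                           # (batch, 36, dim)
    # [a, b, a*b] -> 3*dim features per pair (32 -> 96, as in the trace).
    interaction = np.concatenate([emb_i, emb_j, emb_i * emb_j], axis=-1)
    # A pair is valid only if both players are present.
    valid = (mask[:, i_idx] * mask[:, j_idx])[..., None]    # (batch, 36, 1)
    valid_counts = np.maximum(valid.sum(axis=1), 1.0)       # (batch, 1)
    return (interaction * valid).sum(axis=1) / valid_counts

emb = np.random.rand(16, 9, 32)
m = np.ones((16, 9))
m[:, 7:] = 0.0        # two padded slots per team
pooled = pairwise_pool(emb, m)
print(pooled.shape)   # (16, 96), matching pairwise:pooled (16, 96)
```

The masking step matters with variable team sizes: without it, padded slots would dilute the mean and teams of different sizes would not be comparable.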
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 96)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 96)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 96)
======================================================================================================================================================
2025-08-09 16:34:56.216638: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 48
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
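The printed shapes let us infer how the match representation is assembled: each team's combined vector is 96-dimensional (a 32-dim pooled embedding plus a 64-dim pairwise-interaction vector), and `match_input` is 288 = 3 × 96, consistent with concatenating `teamA_combined`, `teamB_combined`, and a 96-dim `matchup_vector`. A minimal sketch of that wiring, assuming the matchup vector is an element-wise difference of the two team vectors (the variable names mirror the log, but the exact operations are assumptions):

```python
import tensorflow as tf

# Stand-ins for the real 96-dim team vectors (32 pooled + 64 pairwise dims).
teamA_combined = tf.keras.Input(shape=(96,))
teamB_combined = tf.keras.Input(shape=(96,))

# Hypothetical matchup_vector (None, 96): element-wise team difference.
matchup_vector = tf.keras.layers.Subtract()([teamA_combined, teamB_combined])

# match_input (None, 288): concatenation of the three 96-dim vectors.
match_input = tf.keras.layers.Concatenate()(
    [teamA_combined, teamB_combined, matchup_vector]
)
print(match_input.shape)  # (None, 288)
```

The 288-dim `match_input` would then feed the dense head that predicts the match outcome.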
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (16, None)
j_batch shape: (16, None)
batch_size shape: ()
i_indices shape: (16, None)
pairwise:interaction (16, None, 64)
pairwise:valid_pairs_mask_expanded (16, None, 1)
pairwise:interaction_masked (16, None, 64)
pairwise:valid_counts (16, 1)
pairwise:pooled (16, 64)
======================================================================================================================================================
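The trace above suggests how the pairwise step enumerates all unordered player pairs within a 9-slot team: an index vector `idx` of shape `(9,)` is broadcast into a `(9, 9)` grid (the repeated `I matrix` printout), masked with an upper-triangular mask so each pair is kept exactly once, and the surviving `(i, j)` index pairs are then tiled across the batch. A minimal NumPy sketch of that indexing (names mirror the log; the actual implementation is an assumption):

```python
import numpy as np

team_size = 9
idx = np.arange(team_size)                                  # idx shape: (9,)

# Broadcast column indices into a grid; every row reads [0 1 2 ... 8],
# matching the "I matrix" printout above.
i = np.broadcast_to(idx[None, :], (team_size, team_size))   # i shape: (9, 9)

# Keep each unordered pair (i, j) with i < j exactly once.
upper_tri_mask = np.triu(np.ones((team_size, team_size), dtype=bool), k=1)

i_pairs = np.broadcast_to(idx[:, None], (team_size, team_size))[upper_tri_mask]
j_pairs = i[upper_tri_mask]

print(len(i_pairs))  # 36 = 9 * 8 / 2 unordered pairs
```

Each `(i_pairs[k], j_pairs[k])` pair would then index the two player embeddings whose interaction features are concatenated, masked for missing slots, and mean-pooled into the 64-dim `pairwise:pooled` vector shown in the trace.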
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
2025-08-09 16:35:02.297137: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 32
  dense_units_2: 80
  learning_rate: 0.0001
  dropout_rate: 0.2
  dropout_rate_2: 0.4
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
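The layer sizes logged above are mutually consistent: each team vector concatenates the 32-dim attention-pooled embedding with the 128-dim pairwise vector (32 + 128 = 160), and the 480-dim match input stacks team A, team B, and a matchup vector (3 × 160 = 480). A minimal NumPy sketch of that wiring; the elementwise difference used for the matchup vector is an assumption, the real model may combine the two teams differently:

```python
import numpy as np

d_pool, d_pair = 32, 128                            # pooled and pairwise widths from the log
teamA_combined = np.zeros((16, d_pool + d_pair))    # (16, 160)
teamB_combined = np.zeros((16, d_pool + d_pair))    # (16, 160)
matchup_vector = teamA_combined - teamB_combined    # assumed A-vs-B comparison, (16, 160)
match_input = np.concatenate(
    [teamA_combined, teamB_combined, matchup_vector], axis=1
)
print(match_input.shape)                            # (16, 480)
```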
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (16, None)
j_batch shape: (16, None)
batch_size shape: ()
i_indices shape: (16, None)
pairwise:interaction (16, None, 128)
pairwise:valid_pairs_mask_expanded (16, None, 1)
pairwise:interaction_masked (16, None, 128)
pairwise:valid_counts (16, 1)
pairwise:pooled (16, 128)
======================================================================================================================================================
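The trace above can be reproduced in miniature. The logged shapes (`idx` (9,), `i` (9, 9), `upper_tri_mask` (9, 9), `pooled` (16, D)) suggest upper-triangular index pairs over the 9 roster slots, one interaction per pair, masking of padded players, and a mean over the valid pairs. A hedged NumPy sketch, using an elementwise product as a stand-in for the model's learned pair interaction (so D equals the embedding size here, not the 64/128 of the real layer):

```python
import numpy as np

def pairwise_pooled(embeds, mask):
    """Masked mean over upper-triangular pairwise interactions.

    embeds: (batch, n, d) player embeddings, padded slots allowed
    mask:   (batch, n), 1.0 for a real player, 0.0 for padding
    returns (batch, d) pooled pairwise interaction vector
    """
    batch, n, d = embeds.shape
    idx = np.arange(n)                           # (n,)
    i, j = np.meshgrid(idx, idx, indexing="ij")  # each (n, n)
    upper = i < j                                # upper-tri mask, (n, n)
    i_pairs, j_pairs = i[upper], j[upper]        # (n*(n-1)/2,) each

    emb_i = embeds[:, i_pairs, :]                # (batch, P, d)
    emb_j = embeds[:, j_pairs, :]
    interaction = emb_i * emb_j                  # stand-in pair interaction
    valid = (mask[:, i_pairs] * mask[:, j_pairs])[..., None]  # (batch, P, 1)
    counts = np.maximum(valid.sum(axis=1), 1.0)  # avoid divide-by-zero
    return (interaction * valid).sum(axis=1) / counts

emb = np.random.rand(16, 9, 32)
msk = np.ones((16, 9))
msk[:, 7:] = 0.0                                 # two padded roster slots
print(pairwise_pooled(emb, msk).shape)           # (16, 32)
```

Masking before pooling matters here: with variable team sizes, averaging over all 36 slot pairs instead of only the valid ones would dilute the signal whenever the roster is short.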
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 128)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 128)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 128)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 16
  dense_units_2: 48
  learning_rate: 0.01
  dropout_rate: 0.30000000000000004
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.2
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (16, None)
j_batch shape: (16, None)
batch_size shape: ()
i_indices shape: (16, None)
pairwise:interaction (16, None, 64)
pairwise:valid_pairs_mask_expanded (16, None, 1)
pairwise:interaction_masked (16, None, 64)
pairwise:valid_counts (16, 1)
pairwise:pooled (16, 64)
======================================================================================================================================================
  player_emb_dim: 32
  dense_units: 48
  dense_units_2: 80
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.1
  dropout_rate_inter: 0.1
  interaction_scale: 4
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 128)
teamA_combined.shape (None, 160)
teamB_combined.shape (None, 160)
matchup_vector.shape (None, 160)
match_input.shape (None, 480)
Outcome
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 32
  learning_rate: 0.0001
  dropout_rate: 0.4
  dropout_rate_2: 0.2
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (16, None)
j_batch shape: (16, None)
batch_size shape: ()
i_indices shape: (16, None)
pairwise:interaction (16, None, 64)
pairwise:valid_pairs_mask_expanded (16, None, 1)
pairwise:interaction_masked (16, None, 64)
pairwise:valid_counts (16, 1)
pairwise:pooled (16, 64)
======================================================================================================================================================
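The masked-pooling stages logged in the trace (`interaction_masked`, `valid_counts`, `pooled`) follow a standard pattern: zero out interactions that involve a padded player slot, then average only over the valid pairs. A minimal NumPy sketch, assuming random stand-in data with the batch size 16, 36 pairs, and 64 interaction features seen in the trace:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, n_pairs, dim = 16, 36, 64

# stand-in for the pairwise interaction features: (16, 36, 64)
interaction = rng.standard_normal((batch, n_pairs, dim)).astype("float32")
# 1.0 where both players in a pair are real, 0.0 where a slot is padding
valid_pairs_mask = (rng.random((batch, n_pairs)) > 0.3).astype("float32")

mask_expanded = valid_pairs_mask[..., None]                   # (16, 36, 1)
interaction_masked = interaction * mask_expanded              # (16, 36, 64)
valid_counts = valid_pairs_mask.sum(axis=1, keepdims=True)    # (16, 1)
# mean over valid pairs only; the max() guards against an all-padding team
pooled = interaction_masked.sum(axis=1) / np.maximum(valid_counts, 1.0)  # (16, 64)
```

Dividing by `valid_counts` rather than by the fixed 36 keeps the pooled vector comparable across games with different numbers of real players per team.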
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
2025-08-09 16:35:15.688110: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
I matrix [[0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 ...
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]
 [0 1 2 ... 6 7 8]]
emb_i finished
  player_emb_dim: 32
  dense_units: 80
  dense_units_2: 48
  learning_rate: 0.01
  dropout_rate: 0.4
  dropout_rate_2: 0.30000000000000004
  dropout_rate_inter: 0.30000000000000004
  interaction_scale: 2
teamA_embeds.shape (None, 9, 32)
teamA_attn.shape (None, 9, 32)
teamA_vector_pooled.shape (None, 32)
teamA_mask.shape (None, 9)
teamA_pairwise.shape (None, 64)
teamA_combined.shape (None, 96)
teamB_combined.shape (None, 96)
matchup_vector.shape (None, 96)
match_input.shape (None, 288)
Outcome
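The logged shapes fit together arithmetically: each team vector concatenates a 32-dim attention-pooled embedding with a 64-dim pooled pairwise vector (96 dims), and the final match input concatenates three 96-dim vectors into 288 dims. A minimal NumPy sketch of this shape bookkeeping (the difference-based `matchup_vector` here is an assumption for illustration, not necessarily how the model combines the teams):

```python
import numpy as np

# Hypothetical batch of 4 matches, mirroring the shapes logged above.
batch, emb_dim, pair_dim = 4, 32, 64

team_a_pooled = np.zeros((batch, emb_dim))     # (4, 32) attention-pooled player embeddings
team_a_pairwise = np.zeros((batch, pair_dim))  # (4, 64) pooled pairwise interactions
team_a_combined = np.concatenate([team_a_pooled, team_a_pairwise], axis=-1)  # (4, 96)

team_b_combined = np.zeros((batch, emb_dim + pair_dim))  # (4, 96), built the same way
matchup_vector = team_a_combined - team_b_combined        # (4, 96), one plausible matchup encoding

match_input = np.concatenate(
    [team_a_combined, team_b_combined, matchup_vector], axis=-1
)
print(match_input.shape)  # (4, 288)
```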
======================================== Starting pairwise interaction computation ==================================================
idx shape: (9,)
i shape: (9, 9)
upper_tri_mask shape: (9, 9)
i_pairs shape: (None,)
i_batch shape: (None, None)
j_batch shape: (None, None)
batch_size shape: ()
i_indices shape: (None, None)
pairwise:interaction (None, None, 64)
pairwise:valid_pairs_mask_expanded (None, None, 1)
pairwise:interaction_masked (None, None, 64)
pairwise:valid_counts (None, 1)
pairwise:pooled (None, 64)
======================================================================================================================================================
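The `idx`, `i`, and `upper_tri_mask` shapes traced above correspond to the standard construction of all unordered player pairs within a 9-slot team. A hedged NumPy sketch (variable names chosen to echo the log, not taken from the actual layer code):

```python
import numpy as np

n = 9                                        # team slots, matching the logged idx shape (9,)
idx = np.arange(n)
i, j = np.meshgrid(idx, idx, indexing="ij")  # (9, 9) row/column index grids; each row of j
                                             # reads 0..8, like the "I matrix" printed above
upper_tri_mask = i < j                       # strictly upper triangle: each pair counted once
i_pairs, j_pairs = i[upper_tri_mask], j[upper_tri_mask]
print(i_pairs.shape)                         # (36,) = 9 * 8 / 2 unordered pairs
```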
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 391ms/step - loss: 1.4967 - mean_absolute_error: 1.0371
1: 1.496654987335205     2: 1.0371296405792236  
1.496654987335205

Best hyperparameters found:
player_emb_dim: 32
dense_units: 16
dense_units_2: 128
learning_rate: 0.01
dropout_rate: 0.1
dropout_rate_2: 0.30000000000000004
dropout_rate_inter: 0.1
interaction_scale: 2
model_inter_real, model_inter_real_train_loss = train_best_hps_model(all_best_hps_inter_real)
HPS: {'player_emb_dim': 32, 'dense_units': 16, 'dense_units_2': 128, 'learning_rate': 0.01, 'dropout_rate': 0.1, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.1, 'interaction_scale': 2}. MSE during RandomSearch: 27.76988410949707. Starting evaluation across all k folds...
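The fold-by-fold evaluation that follows rests on k-fold cross-validation over the small set of recorded games. A self-contained sketch of the index splitting (the helper name is hypothetical; the real `train_best_hps_model` presumably wraps Keras training around splits like these):

```python
import numpy as np

def kfold_indices(n_samples, k, seed=0):
    """Yield (train_idx, val_idx) pairs for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(n_samples)
    folds = np.array_split(order, k)
    for i in range(k):
        val_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, val_idx

# ~20 recorded games split into 5 folds of 4 validation games each.
splits = list(kfold_indices(20, 5))
print(len(splits))  # 5
```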

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 3.5973 - mean_absolute_error: 1.4554 - val_loss: 36.3455 - val_mean_absolute_error: 4.7837
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.5008 - mean_absolute_error: 1.2476 - val_loss: 32.9572 - val_mean_absolute_error: 4.4972
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.0052 - mean_absolute_error: 0.8165 - val_loss: 31.1545 - val_mean_absolute_error: 4.2246
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.3243 - mean_absolute_error: 1.0401 - val_loss: 43.2387 - val_mean_absolute_error: 5.3763
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.2560 - mean_absolute_error: 1.9587 - val_loss: 33.0719 - val_mean_absolute_error: 4.4607
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.2122 - mean_absolute_error: 0.7824 - val_loss: 32.8423 - val_mean_absolute_error: 4.5231
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.3788 - mean_absolute_error: 1.2476 - val_loss: 34.0589 - val_mean_absolute_error: 4.5492
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.8508 - mean_absolute_error: 1.0415 - val_loss: 35.8715 - val_mean_absolute_error: 4.6024
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.0457 - mean_absolute_error: 0.7588 - val_loss: 36.8773 - val_mean_absolute_error: 4.6479
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.4328 - mean_absolute_error: 0.5137 - val_loss: 37.2911 - val_mean_absolute_error: 4.6806
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1850 - mean_absolute_error: 0.3319 - val_loss: 37.4739 - val_mean_absolute_error: 4.7090
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3521 - mean_absolute_error: 0.4378 - val_loss: 37.7726 - val_mean_absolute_error: 4.7608
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3485 - mean_absolute_error: 0.3854 - val_loss: 38.4462 - val_mean_absolute_error: 4.8555
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 52ms/step - loss: 0.3818 - mean_absolute_error: 0.4634 - val_loss: 39.0109 - val_mean_absolute_error: 4.9117
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2408 - mean_absolute_error: 0.3562 - val_loss: 39.9418 - val_mean_absolute_error: 4.9800
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.5882 - mean_absolute_error: 0.5623 - val_loss: 39.4534 - val_mean_absolute_error: 4.8982
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.2506 - mean_absolute_error: 0.3202 - val_loss: 38.5217 - val_mean_absolute_error: 4.7640
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1821 - mean_absolute_error: 0.3469 - val_loss: 38.1233 - val_mean_absolute_error: 4.6941
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5203 - mean_absolute_error: 0.5226 - val_loss: 37.8719 - val_mean_absolute_error: 4.6697
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.7566 - mean_absolute_error: 0.6467 - val_loss: 37.9161 - val_mean_absolute_error: 4.6977
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.6876 - mean_absolute_error: 0.5602 - val_loss: 38.4018 - val_mean_absolute_error: 4.7779
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3358 - mean_absolute_error: 0.3798 - val_loss: 38.6312 - val_mean_absolute_error: 4.8314
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3593 - mean_absolute_error: 0.3674 - val_loss: 38.2754 - val_mean_absolute_error: 4.8023
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.1452 - mean_absolute_error: 0.3005 - val_loss: 38.1424 - val_mean_absolute_error: 4.7996
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.6429 - mean_absolute_error: 0.5863 - val_loss: 37.9126 - val_mean_absolute_error: 4.7773
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3554 - mean_absolute_error: 0.4032 - val_loss: 37.7633 - val_mean_absolute_error: 4.7382
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1459 - mean_absolute_error: 0.2853 - val_loss: 37.5557 - val_mean_absolute_error: 4.6844
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2415 - mean_absolute_error: 0.3209 - val_loss: 37.7464 - val_mean_absolute_error: 4.6661
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3071 - mean_absolute_error: 0.3024 - val_loss: 38.0899 - val_mean_absolute_error: 4.6681
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2798 - mean_absolute_error: 0.3359 - val_loss: 38.3233 - val_mean_absolute_error: 4.6662
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2355 - mean_absolute_error: 0.3372 - val_loss: 38.5726 - val_mean_absolute_error: 4.6798
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2004 - mean_absolute_error: 0.3296 - val_loss: 38.8811 - val_mean_absolute_error: 4.7067
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1267 - mean_absolute_error: 0.2619 - val_loss: 39.3293 - val_mean_absolute_error: 4.7443
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2219 - mean_absolute_error: 0.3127 - val_loss: 39.7634 - val_mean_absolute_error: 4.7882
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1009 - mean_absolute_error: 0.2368 - val_loss: 39.9887 - val_mean_absolute_error: 4.8150
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2137 - mean_absolute_error: 0.3563 - val_loss: 40.2516 - val_mean_absolute_error: 4.8502
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4575 - mean_absolute_error: 0.4830 - val_loss: 39.9879 - val_mean_absolute_error: 4.8377
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.1852 - mean_absolute_error: 0.2666 - val_loss: 39.6795 - val_mean_absolute_error: 4.8112
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.1711 - mean_absolute_error: 0.2336 - val_loss: 39.1087 - val_mean_absolute_error: 4.7509
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.0887 - mean_absolute_error: 0.2063 - val_loss: 38.6159 - val_mean_absolute_error: 4.6913
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1634 - mean_absolute_error: 0.2170 - val_loss: 38.4717 - val_mean_absolute_error: 4.6673
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1540 - mean_absolute_error: 0.2773 - val_loss: 38.2689 - val_mean_absolute_error: 4.6691
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1375 - mean_absolute_error: 0.2522 - val_loss: 38.0424 - val_mean_absolute_error: 4.6640
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3742 - mean_absolute_error: 0.3579 - val_loss: 37.8393 - val_mean_absolute_error: 4.6714
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2907 - mean_absolute_error: 0.3842 - val_loss: 37.6628 - val_mean_absolute_error: 4.6732
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2146 - mean_absolute_error: 0.3172 - val_loss: 37.8256 - val_mean_absolute_error: 4.6777
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.6426 - mean_absolute_error: 0.3922 - val_loss: 38.0097 - val_mean_absolute_error: 4.6814
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.0269 - mean_absolute_error: 0.1097 - val_loss: 38.1981 - val_mean_absolute_error: 4.6817
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6338 - mean_absolute_error: 0.4578 - val_loss: 38.5222 - val_mean_absolute_error: 4.7057
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.0682 - mean_absolute_error: 0.2084 - val_loss: 39.0428 - val_mean_absolute_error: 4.7459

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 59ms/step - loss: 10.7477 - mean_absolute_error: 1.4317 - val_loss: 0.0113 - val_mean_absolute_error: 0.0644
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 8.6836 - mean_absolute_error: 1.2589 - val_loss: 0.0405 - val_mean_absolute_error: 0.1422
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 8.5342 - mean_absolute_error: 1.5483 - val_loss: 0.1751 - val_mean_absolute_error: 0.2711
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 8.1789 - mean_absolute_error: 1.8959 - val_loss: 0.1993 - val_mean_absolute_error: 0.3287
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 6.8435 - mean_absolute_error: 1.5139 - val_loss: 0.1236 - val_mean_absolute_error: 0.2563
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 66ms/step - loss: 6.3630 - mean_absolute_error: 1.3973 - val_loss: 0.0412 - val_mean_absolute_error: 0.1592
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 4.5332 - mean_absolute_error: 1.2808 - val_loss: 0.0214 - val_mean_absolute_error: 0.1048
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.5805 - mean_absolute_error: 1.1333 - val_loss: 0.3593 - val_mean_absolute_error: 0.4079
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 7.5361 - mean_absolute_error: 1.4804 - val_loss: 2.1774 - val_mean_absolute_error: 0.8709
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 6.1152 - mean_absolute_error: 1.3560 - val_loss: 6.0099 - val_mean_absolute_error: 1.3772
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.7265 - mean_absolute_error: 1.1821 - val_loss: 8.1837 - val_mean_absolute_error: 1.6040
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.7019 - mean_absolute_error: 0.8132 - val_loss: 10.3651 - val_mean_absolute_error: 1.8969
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.0086 - mean_absolute_error: 1.1856 - val_loss: 6.4182 - val_mean_absolute_error: 1.6294
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.4897 - mean_absolute_error: 1.2965 - val_loss: 4.7423 - val_mean_absolute_error: 1.4760
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.5428 - mean_absolute_error: 1.0913 - val_loss: 3.0936 - val_mean_absolute_error: 1.2530
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.9100 - mean_absolute_error: 0.8304 - val_loss: 2.2576 - val_mean_absolute_error: 1.0553
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.9303 - mean_absolute_error: 0.9177 - val_loss: 2.0995 - val_mean_absolute_error: 0.9004
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.2405 - mean_absolute_error: 1.0512 - val_loss: 2.4863 - val_mean_absolute_error: 0.9212
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.7389 - mean_absolute_error: 0.8426 - val_loss: 2.6217 - val_mean_absolute_error: 0.9594
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.9099 - mean_absolute_error: 0.9369 - val_loss: 3.0333 - val_mean_absolute_error: 1.0823
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.7533 - mean_absolute_error: 0.9114 - val_loss: 4.1841 - val_mean_absolute_error: 1.2434
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.4985 - mean_absolute_error: 0.7456 - val_loss: 5.9704 - val_mean_absolute_error: 1.4280
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.8696 - mean_absolute_error: 0.8614 - val_loss: 6.1268 - val_mean_absolute_error: 1.5098
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.7993 - mean_absolute_error: 0.8809 - val_loss: 5.7353 - val_mean_absolute_error: 1.5946
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.7418 - mean_absolute_error: 0.8395 - val_loss: 4.1150 - val_mean_absolute_error: 1.4455
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.3960 - mean_absolute_error: 0.6955 - val_loss: 3.0300 - val_mean_absolute_error: 1.3023
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.5273 - mean_absolute_error: 0.7657 - val_loss: 2.5239 - val_mean_absolute_error: 1.2161
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.8716 - mean_absolute_error: 0.9981 - val_loss: 2.2320 - val_mean_absolute_error: 1.1350
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 6.0929 - mean_absolute_error: 1.2757 - val_loss: 2.3136 - val_mean_absolute_error: 1.1298
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.8901 - mean_absolute_error: 0.8449 - val_loss: 2.7754 - val_mean_absolute_error: 1.0954
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.4711 - mean_absolute_error: 0.7844 - val_loss: 3.2100 - val_mean_absolute_error: 1.0196
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.4447 - mean_absolute_error: 0.7707 - val_loss: 3.0341 - val_mean_absolute_error: 0.9729
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.9574 - mean_absolute_error: 1.0504 - val_loss: 3.7499 - val_mean_absolute_error: 1.0745
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 3.2066 - mean_absolute_error: 1.2238 - val_loss: 5.1191 - val_mean_absolute_error: 1.3482
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 102ms/step - loss: 2.6748 - mean_absolute_error: 0.9355 - val_loss: 3.9295 - val_mean_absolute_error: 1.3499
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 56ms/step - loss: 3.1960 - mean_absolute_error: 1.2243 - val_loss: 1.5818 - val_mean_absolute_error: 0.9353
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 55ms/step - loss: 1.6051 - mean_absolute_error: 0.8539 - val_loss: 0.5900 - val_mean_absolute_error: 0.6280
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.8978 - mean_absolute_error: 1.1598 - val_loss: 0.3779 - val_mean_absolute_error: 0.5236
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.6323 - mean_absolute_error: 0.9048 - val_loss: 0.3617 - val_mean_absolute_error: 0.5385
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 3.0092 - mean_absolute_error: 1.0202 - val_loss: 0.6302 - val_mean_absolute_error: 0.7189
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 3.3827 - mean_absolute_error: 1.0342 - val_loss: 1.0099 - val_mean_absolute_error: 0.9362
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.2022 - mean_absolute_error: 0.9805 - val_loss: 1.0656 - val_mean_absolute_error: 0.9506
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.0778 - mean_absolute_error: 1.0762 - val_loss: 0.9394 - val_mean_absolute_error: 0.8039
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.5363 - mean_absolute_error: 0.8498 - val_loss: 0.8563 - val_mean_absolute_error: 0.7010
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.3185 - mean_absolute_error: 0.6896 - val_loss: 0.7388 - val_mean_absolute_error: 0.6374
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.6839 - mean_absolute_error: 0.8529 - val_loss: 0.6336 - val_mean_absolute_error: 0.6420
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.3593 - mean_absolute_error: 0.6897 - val_loss: 0.8489 - val_mean_absolute_error: 0.8023
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.2082 - mean_absolute_error: 0.6984 - val_loss: 0.9856 - val_mean_absolute_error: 0.8906
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.5987 - mean_absolute_error: 0.8295 - val_loss: 1.5882 - val_mean_absolute_error: 1.1653
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.2881 - mean_absolute_error: 0.7508 - val_loss: 1.5730 - val_mean_absolute_error: 1.1609

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 60ms/step - loss: 1.9054 - mean_absolute_error: 0.9380 - val_loss: 0.3381 - val_mean_absolute_error: 0.4912
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.1490 - mean_absolute_error: 0.6931 - val_loss: 0.6341 - val_mean_absolute_error: 0.6028
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.6089 - mean_absolute_error: 0.8468 - val_loss: 0.7661 - val_mean_absolute_error: 0.7004
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.7235 - mean_absolute_error: 0.9776 - val_loss: 0.5626 - val_mean_absolute_error: 0.5657
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.8148 - mean_absolute_error: 1.1643 - val_loss: 0.3619 - val_mean_absolute_error: 0.4687
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.6806 - mean_absolute_error: 0.9179 - val_loss: 0.2471 - val_mean_absolute_error: 0.4129
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.3894 - mean_absolute_error: 0.8111 - val_loss: 0.1834 - val_mean_absolute_error: 0.3485
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.3124 - mean_absolute_error: 0.7878 - val_loss: 0.1810 - val_mean_absolute_error: 0.3436
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.6760 - mean_absolute_error: 1.0361 - val_loss: 0.2611 - val_mean_absolute_error: 0.3655
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 1.3664 - mean_absolute_error: 0.8494 - val_loss: 0.3599 - val_mean_absolute_error: 0.4760
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.0588 - mean_absolute_error: 0.6633 - val_loss: 0.4598 - val_mean_absolute_error: 0.5887
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1569 - mean_absolute_error: 0.6568 - val_loss: 0.4554 - val_mean_absolute_error: 0.5438
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.0560 - mean_absolute_error: 0.9801 - val_loss: 0.4136 - val_mean_absolute_error: 0.5157
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.9642 - mean_absolute_error: 0.6292 - val_loss: 0.4172 - val_mean_absolute_error: 0.5973
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.7968 - mean_absolute_error: 0.4643 - val_loss: 0.4533 - val_mean_absolute_error: 0.6315
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 1.0194 - mean_absolute_error: 0.6678 - val_loss: 0.4805 - val_mean_absolute_error: 0.6286
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.1208 - mean_absolute_error: 0.8720 - val_loss: 0.5068 - val_mean_absolute_error: 0.6282
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.2924 - mean_absolute_error: 0.7855 - val_loss: 0.5293 - val_mean_absolute_error: 0.6300
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1382 - mean_absolute_error: 0.6403 - val_loss: 0.4819 - val_mean_absolute_error: 0.5706
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.8443 - mean_absolute_error: 0.5432 - val_loss: 0.3992 - val_mean_absolute_error: 0.4637
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.8118 - mean_absolute_error: 0.5675 - val_loss: 0.3253 - val_mean_absolute_error: 0.3645
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.0569 - mean_absolute_error: 0.6964 - val_loss: 0.2609 - val_mean_absolute_error: 0.3663
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1954 - mean_absolute_error: 0.6970 - val_loss: 0.2340 - val_mean_absolute_error: 0.4287
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.8695 - mean_absolute_error: 0.6204 - val_loss: 0.2310 - val_mean_absolute_error: 0.4383
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2191 - mean_absolute_error: 0.7813 - val_loss: 0.2388 - val_mean_absolute_error: 0.4218
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.1850 - mean_absolute_error: 0.8121 - val_loss: 0.1163 - val_mean_absolute_error: 0.2814
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.7051 - mean_absolute_error: 0.5669 - val_loss: 0.1059 - val_mean_absolute_error: 0.2995
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.7098 - mean_absolute_error: 0.5515 - val_loss: 0.1377 - val_mean_absolute_error: 0.3450
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5887 - mean_absolute_error: 0.4510 - val_loss: 0.2027 - val_mean_absolute_error: 0.3910
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.6455 - mean_absolute_error: 0.5645 - val_loss: 0.2980 - val_mean_absolute_error: 0.4835
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.2365 - mean_absolute_error: 0.8241 - val_loss: 0.4117 - val_mean_absolute_error: 0.5829
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.8404 - mean_absolute_error: 0.6642 - val_loss: 0.5259 - val_mean_absolute_error: 0.6098
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.0294 - mean_absolute_error: 0.7329 - val_loss: 0.5695 - val_mean_absolute_error: 0.5727
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.0276 - mean_absolute_error: 0.7176 - val_loss: 0.5134 - val_mean_absolute_error: 0.5206
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.5736 - mean_absolute_error: 1.0210 - val_loss: 0.4257 - val_mean_absolute_error: 0.5088
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4562 - mean_absolute_error: 0.4803 - val_loss: 0.3422 - val_mean_absolute_error: 0.4981
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 8.9488 - mean_absolute_error: 1.1709 - val_loss: 0.2877 - val_mean_absolute_error: 0.4909
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.3184 - mean_absolute_error: 0.8691 - val_loss: 0.2347 - val_mean_absolute_error: 0.4238
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.9516 - mean_absolute_error: 0.7900 - val_loss: 0.2619 - val_mean_absolute_error: 0.4350
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.8051 - mean_absolute_error: 0.6813 - val_loss: 0.3179 - val_mean_absolute_error: 0.4789
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4892 - mean_absolute_error: 0.4578 - val_loss: 0.3991 - val_mean_absolute_error: 0.5537
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5213 - mean_absolute_error: 0.5527 - val_loss: 0.4817 - val_mean_absolute_error: 0.6118
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9424 - mean_absolute_error: 0.7145 - val_loss: 0.4822 - val_mean_absolute_error: 0.5777
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.5081 - mean_absolute_error: 0.5498 - val_loss: 0.4515 - val_mean_absolute_error: 0.5868
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.8776 - mean_absolute_error: 0.6319 - val_loss: 0.3935 - val_mean_absolute_error: 0.5815
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4341 - mean_absolute_error: 0.4368 - val_loss: 0.3385 - val_mean_absolute_error: 0.5382
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6939 - mean_absolute_error: 0.6199 - val_loss: 0.3174 - val_mean_absolute_error: 0.4693
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2700 - mean_absolute_error: 0.3517 - val_loss: 0.3496 - val_mean_absolute_error: 0.5040
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3679 - mean_absolute_error: 0.4350 - val_loss: 0.3780 - val_mean_absolute_error: 0.5216
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.4602 - mean_absolute_error: 0.5286 - val_loss: 0.3295 - val_mean_absolute_error: 0.4990

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 62ms/step - loss: 0.7341 - mean_absolute_error: 0.6293 - val_loss: 0.0372 - val_mean_absolute_error: 0.1417
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.4292 - mean_absolute_error: 0.5274 - val_loss: 0.0186 - val_mean_absolute_error: 0.0797
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.1406 - mean_absolute_error: 0.7421 - val_loss: 0.0611 - val_mean_absolute_error: 0.1597
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3562 - mean_absolute_error: 0.4475 - val_loss: 0.1197 - val_mean_absolute_error: 0.2664
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4182 - mean_absolute_error: 0.5049 - val_loss: 0.1260 - val_mean_absolute_error: 0.2851
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.6148 - mean_absolute_error: 0.8944 - val_loss: 0.0741 - val_mean_absolute_error: 0.2349
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.8543 - mean_absolute_error: 0.5925 - val_loss: 0.1096 - val_mean_absolute_error: 0.2581
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.4630 - mean_absolute_error: 0.7361 - val_loss: 0.1298 - val_mean_absolute_error: 0.2884
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4201 - mean_absolute_error: 0.4457 - val_loss: 0.1378 - val_mean_absolute_error: 0.3153
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5921 - mean_absolute_error: 0.5440 - val_loss: 0.1371 - val_mean_absolute_error: 0.3329
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2881 - mean_absolute_error: 0.3594 - val_loss: 0.1696 - val_mean_absolute_error: 0.3371
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3361 - mean_absolute_error: 0.3848 - val_loss: 0.2128 - val_mean_absolute_error: 0.3283
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.7416 - mean_absolute_error: 0.5887 - val_loss: 0.2267 - val_mean_absolute_error: 0.3501
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.1912 - mean_absolute_error: 0.6755 - val_loss: 0.2239 - val_mean_absolute_error: 0.4286
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.6318 - mean_absolute_error: 0.5303 - val_loss: 0.2676 - val_mean_absolute_error: 0.4917
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.9831 - mean_absolute_error: 0.8065 - val_loss: 0.3248 - val_mean_absolute_error: 0.5400
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.4200 - mean_absolute_error: 0.4665 - val_loss: 0.3338 - val_mean_absolute_error: 0.5249
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5208 - mean_absolute_error: 0.5984 - val_loss: 0.4296 - val_mean_absolute_error: 0.5510
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6260 - mean_absolute_error: 0.6115 - val_loss: 0.5800 - val_mean_absolute_error: 0.6331
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5517 - mean_absolute_error: 0.4151 - val_loss: 0.5463 - val_mean_absolute_error: 0.6086
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5229 - mean_absolute_error: 0.5573 - val_loss: 0.4380 - val_mean_absolute_error: 0.5418
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6016 - mean_absolute_error: 0.6408 - val_loss: 0.3673 - val_mean_absolute_error: 0.4887
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1889 - mean_absolute_error: 0.3490 - val_loss: 0.3446 - val_mean_absolute_error: 0.4585
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3303 - mean_absolute_error: 0.4156 - val_loss: 0.4083 - val_mean_absolute_error: 0.4951
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.6773 - mean_absolute_error: 0.5579 - val_loss: 0.3457 - val_mean_absolute_error: 0.4311
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2083 - mean_absolute_error: 0.3529 - val_loss: 0.2879 - val_mean_absolute_error: 0.4078
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3384 - mean_absolute_error: 0.4227 - val_loss: 0.1845 - val_mean_absolute_error: 0.3314
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1546 - mean_absolute_error: 0.6628 - val_loss: 0.1876 - val_mean_absolute_error: 0.2635
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.2733 - mean_absolute_error: 0.4121 - val_loss: 0.3352 - val_mean_absolute_error: 0.4720
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4625 - mean_absolute_error: 0.4789 - val_loss: 0.4193 - val_mean_absolute_error: 0.5398
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.8756 - mean_absolute_error: 0.7350 - val_loss: 0.3888 - val_mean_absolute_error: 0.4924
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.6532 - mean_absolute_error: 0.5788 - val_loss: 0.2292 - val_mean_absolute_error: 0.3750
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.2520 - mean_absolute_error: 0.3954 - val_loss: 0.1639 - val_mean_absolute_error: 0.3204
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.6620 - mean_absolute_error: 0.6637 - val_loss: 0.1892 - val_mean_absolute_error: 0.3100
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.9665 - mean_absolute_error: 0.6122 - val_loss: 0.2153 - val_mean_absolute_error: 0.3497
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5605 - mean_absolute_error: 0.4930 - val_loss: 0.3096 - val_mean_absolute_error: 0.4638
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.0268 - mean_absolute_error: 0.6626 - val_loss: 0.3425 - val_mean_absolute_error: 0.5194
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5998 - mean_absolute_error: 0.4881 - val_loss: 0.3658 - val_mean_absolute_error: 0.5516
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2245 - mean_absolute_error: 0.3873 - val_loss: 0.4142 - val_mean_absolute_error: 0.5848
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2986 - mean_absolute_error: 0.4289 - val_loss: 0.4519 - val_mean_absolute_error: 0.6060
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3629 - mean_absolute_error: 0.4969 - val_loss: 0.4745 - val_mean_absolute_error: 0.6203
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.3696 - mean_absolute_error: 0.4342 - val_loss: 0.4807 - val_mean_absolute_error: 0.6351
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.2003 - mean_absolute_error: 0.6618 - val_loss: 0.4880 - val_mean_absolute_error: 0.5873
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1254 - mean_absolute_error: 0.2725 - val_loss: 0.5523 - val_mean_absolute_error: 0.6673
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2027 - mean_absolute_error: 0.3588 - val_loss: 0.6575 - val_mean_absolute_error: 0.7446
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5147 - mean_absolute_error: 0.5600 - val_loss: 0.7224 - val_mean_absolute_error: 0.7836
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3556 - mean_absolute_error: 0.4863 - val_loss: 0.8158 - val_mean_absolute_error: 0.8046
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.5962 - mean_absolute_error: 0.4995 - val_loss: 0.7709 - val_mean_absolute_error: 0.7762
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.5078 - mean_absolute_error: 0.4936 - val_loss: 0.6686 - val_mean_absolute_error: 0.7050
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2634 - mean_absolute_error: 0.3789 - val_loss: 0.5785 - val_mean_absolute_error: 0.6591

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 58ms/step - loss: 0.9964 - mean_absolute_error: 0.6986 - val_loss: 0.1803 - val_mean_absolute_error: 0.3788
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2648 - mean_absolute_error: 0.4248 - val_loss: 0.1944 - val_mean_absolute_error: 0.4106
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.2719 - mean_absolute_error: 0.9995 - val_loss: 0.1644 - val_mean_absolute_error: 0.3409
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.3703 - mean_absolute_error: 0.4646 - val_loss: 0.1803 - val_mean_absolute_error: 0.4026
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.3836 - mean_absolute_error: 0.4117 - val_loss: 0.2408 - val_mean_absolute_error: 0.4824
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.7455 - mean_absolute_error: 0.5555 - val_loss: 0.2808 - val_mean_absolute_error: 0.5219
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.5336 - mean_absolute_error: 0.6150 - val_loss: 0.2665 - val_mean_absolute_error: 0.5047
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.2721 - mean_absolute_error: 0.4152 - val_loss: 0.2530 - val_mean_absolute_error: 0.4560
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.2154 - mean_absolute_error: 0.3296 - val_loss: 0.2650 - val_mean_absolute_error: 0.4026
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.3274 - mean_absolute_error: 0.4485 - val_loss: 0.2979 - val_mean_absolute_error: 0.4428
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.6624 - mean_absolute_error: 0.6563 - val_loss: 0.3294 - val_mean_absolute_error: 0.4914
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.4304 - mean_absolute_error: 0.5476 - val_loss: 0.3593 - val_mean_absolute_error: 0.5419
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 0.1422 - mean_absolute_error: 0.3030 - val_loss: 0.4068 - val_mean_absolute_error: 0.5833
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.8023 - mean_absolute_error: 0.5925 - val_loss: 0.3857 - val_mean_absolute_error: 0.5451
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.9486 - mean_absolute_error: 0.6277 - val_loss: 0.3158 - val_mean_absolute_error: 0.4695
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4024 - mean_absolute_error: 0.4931 - val_loss: 0.2428 - val_mean_absolute_error: 0.3896
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.0793 - mean_absolute_error: 0.2172 - val_loss: 0.1786 - val_mean_absolute_error: 0.3239
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2403 - mean_absolute_error: 0.3306 - val_loss: 0.1219 - val_mean_absolute_error: 0.2588
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1522 - mean_absolute_error: 0.3258 - val_loss: 0.0836 - val_mean_absolute_error: 0.2155
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1687 - mean_absolute_error: 0.3168 - val_loss: 0.0702 - val_mean_absolute_error: 0.2194
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4137 - mean_absolute_error: 0.4272 - val_loss: 0.0697 - val_mean_absolute_error: 0.2165
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2850 - mean_absolute_error: 0.4370 - val_loss: 0.0585 - val_mean_absolute_error: 0.1999
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1358 - mean_absolute_error: 0.2861 - val_loss: 0.0528 - val_mean_absolute_error: 0.2073
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1575 - mean_absolute_error: 0.3129 - val_loss: 0.0571 - val_mean_absolute_error: 0.2222
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.2193 - mean_absolute_error: 0.3042 - val_loss: 0.0590 - val_mean_absolute_error: 0.2250
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.5087 - mean_absolute_error: 0.4788 - val_loss: 0.0668 - val_mean_absolute_error: 0.2411
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 3.4478 - mean_absolute_error: 0.8753 - val_loss: 0.0869 - val_mean_absolute_error: 0.2673
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2999 - mean_absolute_error: 0.4020 - val_loss: 0.1184 - val_mean_absolute_error: 0.3027
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1802 - mean_absolute_error: 0.3174 - val_loss: 0.1402 - val_mean_absolute_error: 0.3280
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1861 - mean_absolute_error: 0.3054 - val_loss: 0.1632 - val_mean_absolute_error: 0.3603
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.1607 - mean_absolute_error: 0.2748 - val_loss: 0.1708 - val_mean_absolute_error: 0.3687
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.2233 - mean_absolute_error: 0.3212 - val_loss: 0.1631 - val_mean_absolute_error: 0.3563
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1192 - mean_absolute_error: 0.2786 - val_loss: 0.1373 - val_mean_absolute_error: 0.3230
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1952 - mean_absolute_error: 0.2954 - val_loss: 0.1120 - val_mean_absolute_error: 0.2861
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.4922 - mean_absolute_error: 0.4150 - val_loss: 0.0891 - val_mean_absolute_error: 0.2287
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1519 - mean_absolute_error: 0.2911 - val_loss: 0.0688 - val_mean_absolute_error: 0.1900
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3454 - mean_absolute_error: 0.4570 - val_loss: 0.0651 - val_mean_absolute_error: 0.1705
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3534 - mean_absolute_error: 0.4466 - val_loss: 0.0723 - val_mean_absolute_error: 0.2269
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2473 - mean_absolute_error: 0.3379 - val_loss: 0.0878 - val_mean_absolute_error: 0.2490
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3127 - mean_absolute_error: 0.4469 - val_loss: 0.0887 - val_mean_absolute_error: 0.2448
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.3113 - mean_absolute_error: 0.3438 - val_loss: 0.0909 - val_mean_absolute_error: 0.2652
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4447 - mean_absolute_error: 0.3918 - val_loss: 0.0672 - val_mean_absolute_error: 0.2045
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.1544 - mean_absolute_error: 0.3308 - val_loss: 0.0716 - val_mean_absolute_error: 0.1970
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.1525 - mean_absolute_error: 0.2784 - val_loss: 0.1337 - val_mean_absolute_error: 0.3295
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3547 - mean_absolute_error: 0.4121 - val_loss: 0.2053 - val_mean_absolute_error: 0.3985
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.1624 - mean_absolute_error: 0.2990 - val_loss: 0.2672 - val_mean_absolute_error: 0.4319
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.2771 - mean_absolute_error: 0.4198 - val_loss: 0.3024 - val_mean_absolute_error: 0.4588
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.2836 - mean_absolute_error: 0.3840 - val_loss: 0.3297 - val_mean_absolute_error: 0.5185
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.6015 - mean_absolute_error: 0.4956 - val_loss: 0.4006 - val_mean_absolute_error: 0.6090
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.2880 - mean_absolute_error: 0.4117 - val_loss: 0.4222 - val_mean_absolute_error: 0.6278
Validation losses: [39.04275131225586, 1.5729528665542603, 0.3294881284236908, 0.5785070657730103, 0.4222172796726227]
HPS: {'player_emb_dim': 32, 'dense_units': 64, 'dense_units_2': 16, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.4, 'interaction_scale': 4}. MSE during RandomSearch: 0.70982825756073. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 3.3259 - mean_absolute_error: 1.5069 - val_loss: 34.0124 - val_mean_absolute_error: 4.4534
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 4.6876 - mean_absolute_error: 1.6685 - val_loss: 39.4737 - val_mean_absolute_error: 5.0167
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 3.7617 - mean_absolute_error: 1.6506 - val_loss: 36.6969 - val_mean_absolute_error: 4.7278
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.2414 - mean_absolute_error: 1.2074 - val_loss: 35.0529 - val_mean_absolute_error: 4.4827
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.6778 - mean_absolute_error: 1.3052 - val_loss: 36.5621 - val_mean_absolute_error: 4.6261
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.2398 - mean_absolute_error: 0.9215 - val_loss: 37.7851 - val_mean_absolute_error: 4.7133
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.9400 - mean_absolute_error: 1.0134 - val_loss: 39.2468 - val_mean_absolute_error: 4.7752
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.0469 - mean_absolute_error: 0.9132 - val_loss: 41.5880 - val_mean_absolute_error: 5.0971
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.1578 - mean_absolute_error: 1.1897 - val_loss: 41.5787 - val_mean_absolute_error: 5.0605
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.3564 - mean_absolute_error: 1.2683 - val_loss: 40.0501 - val_mean_absolute_error: 4.8223
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 3.1436 - mean_absolute_error: 1.4018 - val_loss: 39.0525 - val_mean_absolute_error: 4.9963
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.1229 - mean_absolute_error: 0.8425 - val_loss: 38.6334 - val_mean_absolute_error: 5.0724
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 3.1006 - mean_absolute_error: 1.2062 - val_loss: 37.4819 - val_mean_absolute_error: 4.9607
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.3697 - mean_absolute_error: 1.2090 - val_loss: 36.5511 - val_mean_absolute_error: 4.8376
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.1389 - mean_absolute_error: 1.2342 - val_loss: 34.5767 - val_mean_absolute_error: 4.4285
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.9582 - mean_absolute_error: 1.1425 - val_loss: 36.0124 - val_mean_absolute_error: 4.5288
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.5498 - mean_absolute_error: 1.3912 - val_loss: 36.8915 - val_mean_absolute_error: 4.6164
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.7223 - mean_absolute_error: 0.9318 - val_loss: 37.3172 - val_mean_absolute_error: 4.6631
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.0087 - mean_absolute_error: 0.9718 - val_loss: 37.5282 - val_mean_absolute_error: 4.6882
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.2918 - mean_absolute_error: 0.8670 - val_loss: 36.8075 - val_mean_absolute_error: 4.6606
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.3711 - mean_absolute_error: 0.8306 - val_loss: 36.6382 - val_mean_absolute_error: 4.6528
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 1.0885 - mean_absolute_error: 0.7570 - val_loss: 36.7834 - val_mean_absolute_error: 4.6590
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.6491 - mean_absolute_error: 1.1269 - val_loss: 36.8998 - val_mean_absolute_error: 4.6481
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.6511 - mean_absolute_error: 0.6286 - val_loss: 36.9532 - val_mean_absolute_error: 4.6205
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.8438 - mean_absolute_error: 1.0615 - val_loss: 36.9781 - val_mean_absolute_error: 4.5755
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.6322 - mean_absolute_error: 0.6353 - val_loss: 37.0666 - val_mean_absolute_error: 4.6403
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.7624 - mean_absolute_error: 0.7482 - val_loss: 37.4019 - val_mean_absolute_error: 4.7188
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.0521 - mean_absolute_error: 1.0336 - val_loss: 37.8190 - val_mean_absolute_error: 4.7864
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1915 - mean_absolute_error: 0.8575 - val_loss: 38.2971 - val_mean_absolute_error: 4.8252
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.7809 - mean_absolute_error: 0.6409 - val_loss: 38.8905 - val_mean_absolute_error: 4.8759
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.7789 - mean_absolute_error: 1.0404 - val_loss: 38.9254 - val_mean_absolute_error: 4.8673
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.7144 - mean_absolute_error: 1.0126 - val_loss: 38.7922 - val_mean_absolute_error: 4.8328
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.5369 - mean_absolute_error: 0.4911 - val_loss: 38.7164 - val_mean_absolute_error: 4.8023
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.7311 - mean_absolute_error: 0.9183 - val_loss: 38.7995 - val_mean_absolute_error: 4.7856
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.4544 - mean_absolute_error: 0.8929 - val_loss: 38.6594 - val_mean_absolute_error: 4.7444
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.9000 - mean_absolute_error: 0.6591 - val_loss: 38.6705 - val_mean_absolute_error: 4.7195
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9776 - mean_absolute_error: 0.7475 - val_loss: 38.7067 - val_mean_absolute_error: 4.7014
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2184 - mean_absolute_error: 0.8977 - val_loss: 38.7330 - val_mean_absolute_error: 4.6961
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.9053 - mean_absolute_error: 0.7694 - val_loss: 38.7108 - val_mean_absolute_error: 4.6921
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.5795 - mean_absolute_error: 0.5687 - val_loss: 38.7412 - val_mean_absolute_error: 4.7148
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 0.9089 - mean_absolute_error: 0.6927 - val_loss: 38.8242 - val_mean_absolute_error: 4.7465
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.5490 - mean_absolute_error: 0.4966 - val_loss: 38.9388 - val_mean_absolute_error: 4.7854
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9534 - mean_absolute_error: 0.6049 - val_loss: 39.2315 - val_mean_absolute_error: 4.8256
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.3455 - mean_absolute_error: 0.8041 - val_loss: 39.2293 - val_mean_absolute_error: 4.8217
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.0534 - mean_absolute_error: 0.7705 - val_loss: 38.9496 - val_mean_absolute_error: 4.7916
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 0.7284 - mean_absolute_error: 0.6119 - val_loss: 38.7809 - val_mean_absolute_error: 4.7737
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.8308 - mean_absolute_error: 0.6696 - val_loss: 38.4936 - val_mean_absolute_error: 4.7411
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.1235 - mean_absolute_error: 0.7894 - val_loss: 38.4386 - val_mean_absolute_error: 4.7288
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1111 - mean_absolute_error: 0.7640 - val_loss: 38.3852 - val_mean_absolute_error: 4.7171
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.7709 - mean_absolute_error: 0.5363 - val_loss: 38.4668 - val_mean_absolute_error: 4.7220

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 59ms/step - loss: 10.9247 - mean_absolute_error: 1.6976 - val_loss: 0.6218 - val_mean_absolute_error: 0.7011
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 10.4574 - mean_absolute_error: 1.5200 - val_loss: 0.7976 - val_mean_absolute_error: 0.7701
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 10.4609 - mean_absolute_error: 1.7675 - val_loss: 0.9805 - val_mean_absolute_error: 0.8695
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 12.0112 - mean_absolute_error: 1.9966 - val_loss: 1.2284 - val_mean_absolute_error: 0.9721
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 10.0545 - mean_absolute_error: 1.6928 - val_loss: 1.4713 - val_mean_absolute_error: 1.0691
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 9.8403 - mean_absolute_error: 1.7185 - val_loss: 1.6967 - val_mean_absolute_error: 1.1604
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 10.1214 - mean_absolute_error: 1.8106 - val_loss: 1.7587 - val_mean_absolute_error: 1.1967
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 10.0469 - mean_absolute_error: 1.7170 - val_loss: 1.7525 - val_mean_absolute_error: 1.2070
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 10.7090 - mean_absolute_error: 1.8668 - val_loss: 1.6260 - val_mean_absolute_error: 1.1736
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 9.7675 - mean_absolute_error: 1.8641 - val_loss: 1.5083 - val_mean_absolute_error: 1.1645
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 9.4464 - mean_absolute_error: 1.8853 - val_loss: 1.4216 - val_mean_absolute_error: 1.1621
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 7.1142 - mean_absolute_error: 1.7415 - val_loss: 1.3401 - val_mean_absolute_error: 1.1438
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 9.4011 - mean_absolute_error: 1.7720 - val_loss: 1.2196 - val_mean_absolute_error: 1.0715
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 7.0988 - mean_absolute_error: 1.7447 - val_loss: 1.1548 - val_mean_absolute_error: 0.9792
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 10.5194 - mean_absolute_error: 1.9387 - val_loss: 1.1165 - val_mean_absolute_error: 0.9099
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 9.7245 - mean_absolute_error: 2.2001 - val_loss: 0.8351 - val_mean_absolute_error: 0.7420
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 10.7543 - mean_absolute_error: 2.0270 - val_loss: 0.4630 - val_mean_absolute_error: 0.5629
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 5.4703 - mean_absolute_error: 1.6024 - val_loss: 0.2439 - val_mean_absolute_error: 0.4317
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 8.2378 - mean_absolute_error: 1.5330 - val_loss: 0.1695 - val_mean_absolute_error: 0.3530
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 4.0451 - mean_absolute_error: 1.4001 - val_loss: 0.3811 - val_mean_absolute_error: 0.4997
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 3.7875 - mean_absolute_error: 1.5284 - val_loss: 1.0185 - val_mean_absolute_error: 0.7494
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 6.3657 - mean_absolute_error: 1.5614 - val_loss: 2.4298 - val_mean_absolute_error: 1.0641
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.3846 - mean_absolute_error: 1.1514 - val_loss: 4.1578 - val_mean_absolute_error: 1.4743
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 5.0581 - mean_absolute_error: 1.4702 - val_loss: 3.7441 - val_mean_absolute_error: 1.5068
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.1066 - mean_absolute_error: 1.0680 - val_loss: 2.8046 - val_mean_absolute_error: 1.4090
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 4.8231 - mean_absolute_error: 1.6666 - val_loss: 2.3563 - val_mean_absolute_error: 1.3335
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 4.1749 - mean_absolute_error: 1.4871 - val_loss: 1.9998 - val_mean_absolute_error: 1.2458
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 3.8169 - mean_absolute_error: 1.3173 - val_loss: 1.9972 - val_mean_absolute_error: 1.2349
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 5.0339 - mean_absolute_error: 1.6545 - val_loss: 2.3646 - val_mean_absolute_error: 1.3248
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 6.3621 - mean_absolute_error: 1.6445 - val_loss: 3.3593 - val_mean_absolute_error: 1.5555
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 5.9441 - mean_absolute_error: 1.4354 - val_loss: 5.1793 - val_mean_absolute_error: 1.8621
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.2762 - mean_absolute_error: 1.1401 - val_loss: 6.3311 - val_mean_absolute_error: 2.0407
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.3244 - mean_absolute_error: 1.1352 - val_loss: 7.5132 - val_mean_absolute_error: 2.2027
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.7176 - mean_absolute_error: 0.8379 - val_loss: 8.4186 - val_mean_absolute_error: 2.3065
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 5.8473 - mean_absolute_error: 1.2595 - val_loss: 6.4866 - val_mean_absolute_error: 2.1015
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 3.3733 - mean_absolute_error: 1.3184 - val_loss: 5.2815 - val_mean_absolute_error: 1.9536
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 34ms/step - loss: 3.1186 - mean_absolute_error: 1.3839 - val_loss: 4.6654 - val_mean_absolute_error: 1.8647
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.7195 - mean_absolute_error: 0.9812 - val_loss: 4.7762 - val_mean_absolute_error: 1.8832
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 4.2257 - mean_absolute_error: 1.2359 - val_loss: 5.2653 - val_mean_absolute_error: 1.9524
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.0719 - mean_absolute_error: 0.8572 - val_loss: 5.3112 - val_mean_absolute_error: 1.9557
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 3.6073 - mean_absolute_error: 1.3784 - val_loss: 4.1738 - val_mean_absolute_error: 1.7793
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.9422 - mean_absolute_error: 0.9535 - val_loss: 3.3144 - val_mean_absolute_error: 1.6128
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.8921 - mean_absolute_error: 0.9905 - val_loss: 2.5171 - val_mean_absolute_error: 1.4053
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.7738 - mean_absolute_error: 1.0862 - val_loss: 2.1995 - val_mean_absolute_error: 1.2938
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2488 - mean_absolute_error: 0.9566 - val_loss: 2.1719 - val_mean_absolute_error: 1.2849
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.5911 - mean_absolute_error: 1.0257 - val_loss: 2.2095 - val_mean_absolute_error: 1.2987
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 5.4585 - mean_absolute_error: 1.4954 - val_loss: 2.4702 - val_mean_absolute_error: 1.3892
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 4.4563 - mean_absolute_error: 1.1404 - val_loss: 3.0959 - val_mean_absolute_error: 1.5542
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 6.9963 - mean_absolute_error: 1.3464 - val_loss: 4.0090 - val_mean_absolute_error: 1.7311
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.0308 - mean_absolute_error: 1.0354 - val_loss: 4.7086 - val_mean_absolute_error: 1.8457

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 58ms/step - loss: 8.8750 - mean_absolute_error: 1.8997 - val_loss: 0.6173 - val_mean_absolute_error: 0.6260
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 7.1722 - mean_absolute_error: 1.8133 - val_loss: 0.6275 - val_mean_absolute_error: 0.6504
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.8245 - mean_absolute_error: 1.0403 - val_loss: 0.6361 - val_mean_absolute_error: 0.6171
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 7.8339 - mean_absolute_error: 1.6455 - val_loss: 0.7302 - val_mean_absolute_error: 0.6386
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.3468 - mean_absolute_error: 0.8682 - val_loss: 0.7806 - val_mean_absolute_error: 0.5784
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.3525 - mean_absolute_error: 1.0442 - val_loss: 0.9046 - val_mean_absolute_error: 0.6698
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 3.0749 - mean_absolute_error: 1.2012 - val_loss: 1.0648 - val_mean_absolute_error: 0.7791
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.9630 - mean_absolute_error: 0.7765 - val_loss: 1.2187 - val_mean_absolute_error: 0.8708
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.2853 - mean_absolute_error: 1.3462 - val_loss: 1.2931 - val_mean_absolute_error: 0.9019
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.2414 - mean_absolute_error: 1.1685 - val_loss: 1.4171 - val_mean_absolute_error: 0.9223
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.8435 - mean_absolute_error: 0.6848 - val_loss: 1.5850 - val_mean_absolute_error: 0.9415
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.3695 - mean_absolute_error: 0.9638 - val_loss: 1.6962 - val_mean_absolute_error: 1.0311
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 9.2679 - mean_absolute_error: 1.8048 - val_loss: 1.5104 - val_mean_absolute_error: 0.9784
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.8167 - mean_absolute_error: 1.1462 - val_loss: 1.4260 - val_mean_absolute_error: 0.9824
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.9240 - mean_absolute_error: 0.8034 - val_loss: 1.4433 - val_mean_absolute_error: 0.9618
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 8.1145 - mean_absolute_error: 1.5174 - val_loss: 1.5517 - val_mean_absolute_error: 0.9561
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.0943 - mean_absolute_error: 1.0278 - val_loss: 1.6360 - val_mean_absolute_error: 0.9351
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 3.4145 - mean_absolute_error: 1.1760 - val_loss: 1.7024 - val_mean_absolute_error: 0.9383
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.8404 - mean_absolute_error: 1.3661 - val_loss: 1.6996 - val_mean_absolute_error: 0.9794
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 4.2246 - mean_absolute_error: 1.2059 - val_loss: 1.6392 - val_mean_absolute_error: 0.9871
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 65ms/step - loss: 3.2393 - mean_absolute_error: 1.2839 - val_loss: 1.5282 - val_mean_absolute_error: 0.9687
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.3459 - mean_absolute_error: 1.2284 - val_loss: 1.4526 - val_mean_absolute_error: 0.9611
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.5028 - mean_absolute_error: 1.0075 - val_loss: 1.3672 - val_mean_absolute_error: 0.9406
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.9182 - mean_absolute_error: 0.9581 - val_loss: 1.3641 - val_mean_absolute_error: 0.9586
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 3.0196 - mean_absolute_error: 1.1118 - val_loss: 1.4265 - val_mean_absolute_error: 0.9938
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 3.5173 - mean_absolute_error: 1.2242 - val_loss: 1.5188 - val_mean_absolute_error: 1.0340
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.7834 - mean_absolute_error: 0.8036 - val_loss: 1.6568 - val_mean_absolute_error: 1.0790
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9905 - mean_absolute_error: 0.7844 - val_loss: 1.7619 - val_mean_absolute_error: 1.1070
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9220 - mean_absolute_error: 0.8034 - val_loss: 1.9214 - val_mean_absolute_error: 1.1443
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.7879 - mean_absolute_error: 1.2947 - val_loss: 1.9064 - val_mean_absolute_error: 1.1382
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.9200 - mean_absolute_error: 1.4622 - val_loss: 1.8864 - val_mean_absolute_error: 1.1195
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.7296 - mean_absolute_error: 1.2039 - val_loss: 1.8394 - val_mean_absolute_error: 1.0961
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.1143 - mean_absolute_error: 0.8384 - val_loss: 1.7638 - val_mean_absolute_error: 1.0607
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 1.3264 - mean_absolute_error: 0.8294 - val_loss: 1.7129 - val_mean_absolute_error: 1.0183
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.9954 - mean_absolute_error: 0.6626 - val_loss: 1.7214 - val_mean_absolute_error: 0.9782
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.3707 - mean_absolute_error: 1.0527 - val_loss: 1.8649 - val_mean_absolute_error: 1.0106
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 8.1246 - mean_absolute_error: 1.2843 - val_loss: 2.0204 - val_mean_absolute_error: 1.0557
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.3374 - mean_absolute_error: 1.1515 - val_loss: 2.1448 - val_mean_absolute_error: 1.0847
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.3518 - mean_absolute_error: 1.0959 - val_loss: 2.2345 - val_mean_absolute_error: 1.1105
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.5082 - mean_absolute_error: 1.0694 - val_loss: 2.1681 - val_mean_absolute_error: 1.0934
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.9311 - mean_absolute_error: 1.1807 - val_loss: 2.0849 - val_mean_absolute_error: 1.0741
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 3.9028 - mean_absolute_error: 1.2399 - val_loss: 1.9615 - val_mean_absolute_error: 1.0893
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 9.4781 - mean_absolute_error: 1.3716 - val_loss: 1.9024 - val_mean_absolute_error: 1.0907
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2021 - mean_absolute_error: 0.7149 - val_loss: 1.9364 - val_mean_absolute_error: 1.1137
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.7597 - mean_absolute_error: 0.6131 - val_loss: 1.9501 - val_mean_absolute_error: 1.1246
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.7017 - mean_absolute_error: 1.2993 - val_loss: 1.9691 - val_mean_absolute_error: 1.1342
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.0213 - mean_absolute_error: 0.9354 - val_loss: 1.9769 - val_mean_absolute_error: 1.1401
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 4.4690 - mean_absolute_error: 1.3677 - val_loss: 1.8903 - val_mean_absolute_error: 1.1225
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.7961 - mean_absolute_error: 0.8039 - val_loss: 1.7780 - val_mean_absolute_error: 1.0970
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.7299 - mean_absolute_error: 0.6695 - val_loss: 1.6535 - val_mean_absolute_error: 1.0659

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 59ms/step - loss: 7.0951 - mean_absolute_error: 1.7264 - val_loss: 0.3048 - val_mean_absolute_error: 0.4874
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.5795 - mean_absolute_error: 1.1699 - val_loss: 0.4428 - val_mean_absolute_error: 0.5528
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.9391 - mean_absolute_error: 1.0513 - val_loss: 0.5843 - val_mean_absolute_error: 0.6043
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.6624 - mean_absolute_error: 0.9023 - val_loss: 0.7909 - val_mean_absolute_error: 0.6723
Epoch 5/50
2025-08-09 16:35:39.607503: E tensorflow/core/framework/node_def_util.cc:676] NodeDef mentions attribute use_unbounded_threadpool which is not in the op definition: Op<name=MapDataset; signature=input_dataset:variant, other_arguments: -> handle:variant; attr=f:func; attr=Targuments:list(type),min=0; attr=output_types:list(type),min=1; attr=output_shapes:list(shape),min=1; attr=use_inter_op_parallelism:bool,default=true; attr=preserve_cardinality:bool,default=false; attr=force_synchronous:bool,default=false; attr=metadata:string,default=""> This may be expected if your graph generating binary is newer  than this binary. Unknown attributes will be ignored. NodeDef: {{node ParallelMapDatasetV2/_16}}
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.8810 - mean_absolute_error: 0.7025 - val_loss: 0.9820 - val_mean_absolute_error: 0.7255
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.0383 - mean_absolute_error: 0.7857 - val_loss: 1.1015 - val_mean_absolute_error: 0.7506
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.5076 - mean_absolute_error: 0.9947 - val_loss: 1.2203 - val_mean_absolute_error: 0.7701
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.0131 - mean_absolute_error: 1.0587 - val_loss: 1.2867 - val_mean_absolute_error: 0.7755
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 9.5092 - mean_absolute_error: 1.5780 - val_loss: 1.3041 - val_mean_absolute_error: 0.7895
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 3.3094 - mean_absolute_error: 1.2591 - val_loss: 1.2783 - val_mean_absolute_error: 0.8250
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 1.0751 - mean_absolute_error: 0.7625 - val_loss: 1.2932 - val_mean_absolute_error: 0.8527
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.7700 - mean_absolute_error: 1.0120 - val_loss: 1.2981 - val_mean_absolute_error: 0.8871
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 5.4257 - mean_absolute_error: 1.4176 - val_loss: 1.2863 - val_mean_absolute_error: 0.9147
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.2865 - mean_absolute_error: 1.2753 - val_loss: 1.3486 - val_mean_absolute_error: 0.9086
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.1343 - mean_absolute_error: 1.1001 - val_loss: 1.3942 - val_mean_absolute_error: 0.9059
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.4360 - mean_absolute_error: 1.1940 - val_loss: 1.4732 - val_mean_absolute_error: 0.8645
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.7014 - mean_absolute_error: 1.0477 - val_loss: 1.5907 - val_mean_absolute_error: 0.8286
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2043 - mean_absolute_error: 0.6247 - val_loss: 1.7811 - val_mean_absolute_error: 0.8398
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.4049 - mean_absolute_error: 0.9771 - val_loss: 2.2009 - val_mean_absolute_error: 0.8840
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.3796 - mean_absolute_error: 0.7120 - val_loss: 2.4940 - val_mean_absolute_error: 0.9033
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.6281 - mean_absolute_error: 1.3027 - val_loss: 2.6730 - val_mean_absolute_error: 0.9436
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.5343 - mean_absolute_error: 0.9678 - val_loss: 2.7106 - val_mean_absolute_error: 0.9495
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.6877 - mean_absolute_error: 0.9318 - val_loss: 2.6766 - val_mean_absolute_error: 0.9693
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.7163 - mean_absolute_error: 0.9484 - val_loss: 2.5516 - val_mean_absolute_error: 0.9673
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.7917 - mean_absolute_error: 1.1068 - val_loss: 2.4885 - val_mean_absolute_error: 0.9742
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.5678 - mean_absolute_error: 0.9043 - val_loss: 2.4938 - val_mean_absolute_error: 0.9837
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.1676 - mean_absolute_error: 0.7949 - val_loss: 2.5276 - val_mean_absolute_error: 0.9887
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 4.3125 - mean_absolute_error: 1.2418 - val_loss: 2.4546 - val_mean_absolute_error: 0.9844
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.3668 - mean_absolute_error: 1.2984 - val_loss: 2.3996 - val_mean_absolute_error: 0.9812
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.8365 - mean_absolute_error: 0.9547 - val_loss: 2.3382 - val_mean_absolute_error: 0.9940
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.7484 - mean_absolute_error: 0.9208 - val_loss: 2.2356 - val_mean_absolute_error: 0.9996
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 3.1211 - mean_absolute_error: 1.0797 - val_loss: 2.1309 - val_mean_absolute_error: 1.0153
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.9225 - mean_absolute_error: 0.7890 - val_loss: 2.0001 - val_mean_absolute_error: 1.0158
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.3004 - mean_absolute_error: 0.8848 - val_loss: 1.9381 - val_mean_absolute_error: 1.0227
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.7085 - mean_absolute_error: 1.1019 - val_loss: 1.8105 - val_mean_absolute_error: 0.9855
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.1144 - mean_absolute_error: 0.9051 - val_loss: 1.8038 - val_mean_absolute_error: 0.9664
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.5026 - mean_absolute_error: 0.8943 - val_loss: 1.9423 - val_mean_absolute_error: 0.9346
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.0243 - mean_absolute_error: 1.1167 - val_loss: 2.2380 - val_mean_absolute_error: 0.9268
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.2568 - mean_absolute_error: 0.7271 - val_loss: 2.7313 - val_mean_absolute_error: 0.9740
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 4.6643 - mean_absolute_error: 1.6047 - val_loss: 3.5779 - val_mean_absolute_error: 1.0614
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.5148 - mean_absolute_error: 0.9943 - val_loss: 4.0868 - val_mean_absolute_error: 1.1357
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.3209 - mean_absolute_error: 1.0153 - val_loss: 4.5116 - val_mean_absolute_error: 1.1844
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.1899 - mean_absolute_error: 1.2318 - val_loss: 4.8021 - val_mean_absolute_error: 1.2208
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.0507 - mean_absolute_error: 1.1468 - val_loss: 5.2001 - val_mean_absolute_error: 1.2743
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.8628 - mean_absolute_error: 1.0754 - val_loss: 5.6639 - val_mean_absolute_error: 1.3449
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2724 - mean_absolute_error: 0.9396 - val_loss: 6.0911 - val_mean_absolute_error: 1.3853
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.4366 - mean_absolute_error: 1.1955 - val_loss: 6.1706 - val_mean_absolute_error: 1.3649
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 3.2052 - mean_absolute_error: 1.2084 - val_loss: 5.8576 - val_mean_absolute_error: 1.3762
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 4.4757 - mean_absolute_error: 1.2793 - val_loss: 5.6085 - val_mean_absolute_error: 1.3996
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.6040 - mean_absolute_error: 1.2780 - val_loss: 5.5252 - val_mean_absolute_error: 1.4303

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 59ms/step - loss: 1.7414 - mean_absolute_error: 0.9562 - val_loss: 0.4978 - val_mean_absolute_error: 0.5585
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.4599 - mean_absolute_error: 1.4492 - val_loss: 0.6276 - val_mean_absolute_error: 0.5604
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.1823 - mean_absolute_error: 1.0989 - val_loss: 0.7379 - val_mean_absolute_error: 0.6253
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.1236 - mean_absolute_error: 1.3181 - val_loss: 0.8650 - val_mean_absolute_error: 0.6979
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2789 - mean_absolute_error: 0.8968 - val_loss: 1.0238 - val_mean_absolute_error: 0.7789
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 2.5167 - mean_absolute_error: 1.0973 - val_loss: 1.1059 - val_mean_absolute_error: 0.8336
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.4474 - mean_absolute_error: 1.0567 - val_loss: 1.0286 - val_mean_absolute_error: 0.8231
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.3268 - mean_absolute_error: 1.0952 - val_loss: 0.7898 - val_mean_absolute_error: 0.7446
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 9.9348 - mean_absolute_error: 1.8347 - val_loss: 0.3649 - val_mean_absolute_error: 0.5044
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.6029 - mean_absolute_error: 1.1645 - val_loss: 0.2878 - val_mean_absolute_error: 0.4745
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 2.0315 - mean_absolute_error: 0.9744 - val_loss: 0.3612 - val_mean_absolute_error: 0.5230
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 4.6496 - mean_absolute_error: 1.4520 - val_loss: 0.5235 - val_mean_absolute_error: 0.6691
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.6224 - mean_absolute_error: 1.4156 - val_loss: 0.7317 - val_mean_absolute_error: 0.8163
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.8975 - mean_absolute_error: 1.4217 - val_loss: 1.0885 - val_mean_absolute_error: 0.9679
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 5.2098 - mean_absolute_error: 1.4595 - val_loss: 1.4179 - val_mean_absolute_error: 1.0633
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.9059 - mean_absolute_error: 1.3508 - val_loss: 1.6371 - val_mean_absolute_error: 1.1106
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 4.8264 - mean_absolute_error: 1.4267 - val_loss: 1.8046 - val_mean_absolute_error: 1.1405
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.3806 - mean_absolute_error: 1.4835 - val_loss: 1.8490 - val_mean_absolute_error: 1.1409
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.7675 - mean_absolute_error: 1.2739 - val_loss: 1.7693 - val_mean_absolute_error: 1.1139
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 4.3997 - mean_absolute_error: 1.3771 - val_loss: 1.5866 - val_mean_absolute_error: 1.0605
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 3.9134 - mean_absolute_error: 1.3528 - val_loss: 1.3366 - val_mean_absolute_error: 0.9858
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.8549 - mean_absolute_error: 1.0537 - val_loss: 1.0719 - val_mean_absolute_error: 0.8956
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.2847 - mean_absolute_error: 0.8346 - val_loss: 0.8102 - val_mean_absolute_error: 0.7959
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.3243 - mean_absolute_error: 0.9797 - val_loss: 0.5449 - val_mean_absolute_error: 0.6716
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 0.6260 - mean_absolute_error: 0.6302 - val_loss: 0.3464 - val_mean_absolute_error: 0.5370
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.6824 - mean_absolute_error: 1.1527 - val_loss: 0.2252 - val_mean_absolute_error: 0.3810
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.6152 - mean_absolute_error: 0.8627 - val_loss: 0.2323 - val_mean_absolute_error: 0.4025
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.7182 - mean_absolute_error: 0.8987 - val_loss: 0.3155 - val_mean_absolute_error: 0.4953
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 35ms/step - loss: 1.3642 - mean_absolute_error: 0.7043 - val_loss: 0.4290 - val_mean_absolute_error: 0.5613
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.6996 - mean_absolute_error: 1.1134 - val_loss: 0.5023 - val_mean_absolute_error: 0.5846
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.7369 - mean_absolute_error: 1.3719 - val_loss: 0.0980 - val_mean_absolute_error: 0.2639
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 113ms/step - loss: 1.9397 - mean_absolute_error: 1.0386 - val_loss: 0.0819 - val_mean_absolute_error: 0.2343
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.7013 - mean_absolute_error: 0.6563 - val_loss: 0.2479 - val_mean_absolute_error: 0.3656
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.6440 - mean_absolute_error: 1.4016 - val_loss: 0.4403 - val_mean_absolute_error: 0.4759
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.8025 - mean_absolute_error: 1.1267 - val_loss: 0.5825 - val_mean_absolute_error: 0.5438
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1560 - mean_absolute_error: 0.7990 - val_loss: 0.6401 - val_mean_absolute_error: 0.5758
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.6856 - mean_absolute_error: 0.9831 - val_loss: 0.6170 - val_mean_absolute_error: 0.5823
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.2938 - mean_absolute_error: 1.0451 - val_loss: 0.5071 - val_mean_absolute_error: 0.5525
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.2136 - mean_absolute_error: 1.0017 - val_loss: 0.3839 - val_mean_absolute_error: 0.5228
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.0283 - mean_absolute_error: 1.2367 - val_loss: 0.2781 - val_mean_absolute_error: 0.4818
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.2004 - mean_absolute_error: 0.9376 - val_loss: 0.2092 - val_mean_absolute_error: 0.4084
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.9180 - mean_absolute_error: 0.7356 - val_loss: 0.2223 - val_mean_absolute_error: 0.3394
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.5174 - mean_absolute_error: 0.9299 - val_loss: 0.3090 - val_mean_absolute_error: 0.4623
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 5.8347 - mean_absolute_error: 1.3076 - val_loss: 0.3889 - val_mean_absolute_error: 0.5318
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 3.9284 - mean_absolute_error: 1.0760 - val_loss: 0.3974 - val_mean_absolute_error: 0.5270
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 4.1321 - mean_absolute_error: 0.9210 - val_loss: 0.3905 - val_mean_absolute_error: 0.5118
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 7.4983 - mean_absolute_error: 1.3907 - val_loss: 0.3371 - val_mean_absolute_error: 0.4186
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 5.9513 - mean_absolute_error: 1.4663 - val_loss: 0.3014 - val_mean_absolute_error: 0.4548
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 5.3608 - mean_absolute_error: 1.2124 - val_loss: 0.4134 - val_mean_absolute_error: 0.5715
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.7554 - mean_absolute_error: 1.1288 - val_loss: 0.5721 - val_mean_absolute_error: 0.6205
Validation losses: [38.46677017211914, 4.708645820617676, 1.6534618139266968, 5.525229454040527, 0.572098433971405]
HPS: {'player_emb_dim': 32, 'dense_units': 128, 'dense_units_2': 16, 'learning_rate': 0.0001, 'dropout_rate': 0.2, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.1, 'interaction_scale': 4}. MSE during RandomSearch: 2.7987241744995117. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 3.8486 - mean_absolute_error: 1.5385 - val_loss: 39.4255 - val_mean_absolute_error: 5.0077
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 4.2466 - mean_absolute_error: 1.6378 - val_loss: 39.3468 - val_mean_absolute_error: 5.0032
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.9875 - mean_absolute_error: 1.5926 - val_loss: 39.2809 - val_mean_absolute_error: 4.9996
... [epochs 4-49 elided: loss falls from ~3.9 to ~2.0; val_loss creeps down from 39.20 to 36.52] ...
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.0142 - mean_absolute_error: 1.0693 - val_loss: 36.4707 - val_mean_absolute_error: 4.8565

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 65ms/step - loss: 11.7390 - mean_absolute_error: 2.1366 - val_loss: 1.5926 - val_mean_absolute_error: 0.9422
... [epochs 2-49 elided: loss falls from ~12.5 to ~6.7; val_loss declines from 1.58 to 0.95] ...
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 52ms/step - loss: 6.6137 - mean_absolute_error: 1.7502 - val_loss: 0.9442 - val_mean_absolute_error: 0.9097

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 75ms/step - loss: 8.3704 - mean_absolute_error: 1.8488 - val_loss: 2.2195 - val_mean_absolute_error: 1.2409
... [epochs 2-49 elided: training loss hovers between ~4 and ~9 while val_loss climbs steadily from 2.22 to 4.23; the model overfits on this fold] ...
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 91ms/step - loss: 6.8557 - mean_absolute_error: 1.7330 - val_loss: 4.2688 - val_mean_absolute_error: 1.6411

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - loss: 6.5534 - mean_absolute_error: 1.6813 - val_loss: 1.5416 - val_mean_absolute_error: 1.1031
... [epochs 2-49 elided: val_loss bottoms out near 1.47 around epoch 10, then rises steadily to 2.11] ...
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.0995 - mean_absolute_error: 1.0686 - val_loss: 2.1218 - val_mean_absolute_error: 1.1642

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 63ms/step - loss: 3.2103 - mean_absolute_error: 1.4254 - val_loss: 0.0331 - val_mean_absolute_error: 0.1349
... [epochs 2-35 elided: training loss varies between ~1 and ~8 while val_loss grows from 0.033 to 0.163] ...
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.5671 - mean_absolute_error: 1.1763 - val_loss: 0.1614 - val_mean_absolute_error: 0.3709
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.9260 - mean_absolute_error: 1.3964 - val_loss: 0.1630 - val_mean_absolute_error: 0.3730
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.2950 - mean_absolute_error: 0.8374 - val_loss: 0.1639 - val_mean_absolute_error: 0.3743
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.4986 - mean_absolute_error: 1.1196 - val_loss: 0.1660 - val_mean_absolute_error: 0.3770
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.8172 - mean_absolute_error: 0.8938 - val_loss: 0.1684 - val_mean_absolute_error: 0.3799
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.6685 - mean_absolute_error: 1.0363 - val_loss: 0.1713 - val_mean_absolute_error: 0.3831
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.7359 - mean_absolute_error: 1.0182 - val_loss: 0.1751 - val_mean_absolute_error: 0.3875
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.3208 - mean_absolute_error: 1.0330 - val_loss: 0.1785 - val_mean_absolute_error: 0.3911
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.3649 - mean_absolute_error: 1.2109 - val_loss: 0.1849 - val_mean_absolute_error: 0.3980
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.1754 - mean_absolute_error: 0.7301 - val_loss: 0.1884 - val_mean_absolute_error: 0.4015
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 3.8501 - mean_absolute_error: 1.0839 - val_loss: 0.1883 - val_mean_absolute_error: 0.4015
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.2362 - mean_absolute_error: 0.9047 - val_loss: 0.1915 - val_mean_absolute_error: 0.4046
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 4.3540 - mean_absolute_error: 1.3450 - val_loss: 0.1945 - val_mean_absolute_error: 0.4076
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.7684 - mean_absolute_error: 1.0519 - val_loss: 0.1966 - val_mean_absolute_error: 0.4100
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.5302 - mean_absolute_error: 1.1413 - val_loss: 0.1964 - val_mean_absolute_error: 0.4101
Validation losses: [36.4707145690918, 0.9441620111465454, 4.268801212310791, 2.1217525005340576, 0.19637979567050934]
HPS: {'player_emb_dim': 32, 'dense_units': 48, 'dense_units_2': 96, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.2, 'interaction_scale': 4}. MSE during RandomSearch: 2.2231674194335938. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 3.8716 - mean_absolute_error: 1.5332 - val_loss: 37.2987 - val_mean_absolute_error: 4.6157
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.7277 - mean_absolute_error: 1.2815 - val_loss: 33.5419 - val_mean_absolute_error: 4.3385
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.6881 - mean_absolute_error: 0.8925 - val_loss: 33.8878 - val_mean_absolute_error: 4.5291
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.8812 - mean_absolute_error: 1.4035 - val_loss: 30.5761 - val_mean_absolute_error: 4.1791
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.4025 - mean_absolute_error: 1.2404 - val_loss: 33.3421 - val_mean_absolute_error: 4.3165
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.5197 - mean_absolute_error: 0.9653 - val_loss: 35.5964 - val_mean_absolute_error: 4.7087
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.1336 - mean_absolute_error: 1.0338 - val_loss: 36.0542 - val_mean_absolute_error: 4.6607
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4602 - mean_absolute_error: 0.5485 - val_loss: 36.1532 - val_mean_absolute_error: 4.6007
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.7369 - mean_absolute_error: 0.6779 - val_loss: 37.6268 - val_mean_absolute_error: 4.7105
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 3.9392 - mean_absolute_error: 1.3405 - val_loss: 41.6393 - val_mean_absolute_error: 5.0372
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.8208 - mean_absolute_error: 0.7343 - val_loss: 43.7084 - val_mean_absolute_error: 5.3564
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.9796 - mean_absolute_error: 0.7735 - val_loss: 42.7305 - val_mean_absolute_error: 5.2587
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.6919 - mean_absolute_error: 1.0379 - val_loss: 42.9981 - val_mean_absolute_error: 5.1982
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.8388 - mean_absolute_error: 1.1248 - val_loss: 39.8703 - val_mean_absolute_error: 4.9524
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.4181 - mean_absolute_error: 0.8564 - val_loss: 37.5317 - val_mean_absolute_error: 4.7542
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.6867 - mean_absolute_error: 0.9657 - val_loss: 35.3372 - val_mean_absolute_error: 4.5503
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.1784 - mean_absolute_error: 0.8155 - val_loss: 35.8120 - val_mean_absolute_error: 4.5423
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.8982 - mean_absolute_error: 1.1206 - val_loss: 37.8731 - val_mean_absolute_error: 4.7612
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.2804 - mean_absolute_error: 1.1190 - val_loss: 38.6327 - val_mean_absolute_error: 4.8494
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.3756 - mean_absolute_error: 1.1373 - val_loss: 37.9562 - val_mean_absolute_error: 4.7841
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.2877 - mean_absolute_error: 0.8911 - val_loss: 37.4643 - val_mean_absolute_error: 4.7473
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.9974 - mean_absolute_error: 0.7459 - val_loss: 37.1311 - val_mean_absolute_error: 4.7236
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.7041 - mean_absolute_error: 0.6706 - val_loss: 37.3121 - val_mean_absolute_error: 4.7239
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.5036 - mean_absolute_error: 0.9372 - val_loss: 37.6094 - val_mean_absolute_error: 4.7374
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.6278 - mean_absolute_error: 0.9874 - val_loss: 38.2211 - val_mean_absolute_error: 4.7961
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.3140 - mean_absolute_error: 0.8398 - val_loss: 38.2061 - val_mean_absolute_error: 4.7808
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.3096 - mean_absolute_error: 0.6942 - val_loss: 38.6522 - val_mean_absolute_error: 4.7983
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.7095 - mean_absolute_error: 0.9042 - val_loss: 39.4233 - val_mean_absolute_error: 4.8339
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.6292 - mean_absolute_error: 0.9642 - val_loss: 39.3253 - val_mean_absolute_error: 4.8244
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.6525 - mean_absolute_error: 0.6061 - val_loss: 39.7157 - val_mean_absolute_error: 4.8186
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1391 - mean_absolute_error: 0.7678 - val_loss: 39.1706 - val_mean_absolute_error: 4.7831
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.5205 - mean_absolute_error: 0.8981 - val_loss: 38.3074 - val_mean_absolute_error: 4.7714
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.9101 - mean_absolute_error: 0.6565 - val_loss: 37.7982 - val_mean_absolute_error: 4.7414
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.3162 - mean_absolute_error: 1.0034 - val_loss: 37.3351 - val_mean_absolute_error: 4.7068
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.1838 - mean_absolute_error: 0.7829 - val_loss: 36.4701 - val_mean_absolute_error: 4.6495
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.4289 - mean_absolute_error: 0.9324 - val_loss: 35.7724 - val_mean_absolute_error: 4.6074
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4921 - mean_absolute_error: 0.5413 - val_loss: 35.1384 - val_mean_absolute_error: 4.5630
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.5734 - mean_absolute_error: 0.6155 - val_loss: 34.8254 - val_mean_absolute_error: 4.5353
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.7627 - mean_absolute_error: 0.8198 - val_loss: 35.6880 - val_mean_absolute_error: 4.6597
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.0723 - mean_absolute_error: 0.7414 - val_loss: 36.9585 - val_mean_absolute_error: 4.7986
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.3421 - mean_absolute_error: 0.4900 - val_loss: 38.2994 - val_mean_absolute_error: 4.9456
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.4710 - mean_absolute_error: 0.8992 - val_loss: 38.4526 - val_mean_absolute_error: 4.9604
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.8301 - mean_absolute_error: 0.7218 - val_loss: 38.4716 - val_mean_absolute_error: 4.9419
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.9825 - mean_absolute_error: 0.7455 - val_loss: 38.5947 - val_mean_absolute_error: 4.9104
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.6539 - mean_absolute_error: 0.5581 - val_loss: 38.1196 - val_mean_absolute_error: 4.8464
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 62ms/step - loss: 0.5317 - mean_absolute_error: 0.5177 - val_loss: 37.4187 - val_mean_absolute_error: 4.8045
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.3767 - mean_absolute_error: 0.4126 - val_loss: 37.3320 - val_mean_absolute_error: 4.8009
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.3799 - mean_absolute_error: 0.4286 - val_loss: 37.2207 - val_mean_absolute_error: 4.7853
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.0941 - mean_absolute_error: 0.6856 - val_loss: 37.0311 - val_mean_absolute_error: 4.7631
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.8840 - mean_absolute_error: 0.5239 - val_loss: 37.0546 - val_mean_absolute_error: 4.7540

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 66ms/step - loss: 9.8301 - mean_absolute_error: 1.6274 - val_loss: 0.5799 - val_mean_absolute_error: 0.5677
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 10.6011 - mean_absolute_error: 1.7598 - val_loss: 0.5434 - val_mean_absolute_error: 0.5659
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 7.4461 - mean_absolute_error: 1.5720 - val_loss: 0.4270 - val_mean_absolute_error: 0.4637
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 6.6691 - mean_absolute_error: 1.7977 - val_loss: 0.3410 - val_mean_absolute_error: 0.3864
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 5.8538 - mean_absolute_error: 1.8192 - val_loss: 0.5646 - val_mean_absolute_error: 0.5455
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 6.8592 - mean_absolute_error: 1.8229 - val_loss: 0.7501 - val_mean_absolute_error: 0.6796
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 3.2997 - mean_absolute_error: 1.2501 - val_loss: 1.3444 - val_mean_absolute_error: 0.8988
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 97ms/step - loss: 2.5487 - mean_absolute_error: 1.0976 - val_loss: 2.6518 - val_mean_absolute_error: 1.1935
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 9.2835 - mean_absolute_error: 1.7666 - val_loss: 4.9984 - val_mean_absolute_error: 1.5210
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 4.5669 - mean_absolute_error: 1.6071 - val_loss: 5.5465 - val_mean_absolute_error: 1.5939
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 6.3305 - mean_absolute_error: 1.7511 - val_loss: 6.9192 - val_mean_absolute_error: 1.7039
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 4.2412 - mean_absolute_error: 1.5466 - val_loss: 6.4776 - val_mean_absolute_error: 1.6778
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 6.4870 - mean_absolute_error: 1.9103 - val_loss: 6.9884 - val_mean_absolute_error: 1.7714
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 4.3285 - mean_absolute_error: 1.3011 - val_loss: 8.8291 - val_mean_absolute_error: 1.9886
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 5.8411 - mean_absolute_error: 1.6060 - val_loss: 7.2195 - val_mean_absolute_error: 1.8393
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.3266 - mean_absolute_error: 1.2328 - val_loss: 6.7043 - val_mean_absolute_error: 1.7622
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.4864 - mean_absolute_error: 1.1166 - val_loss: 6.8817 - val_mean_absolute_error: 1.7311
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.2570 - mean_absolute_error: 1.1764 - val_loss: 7.7891 - val_mean_absolute_error: 1.8180
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.9409 - mean_absolute_error: 0.8024 - val_loss: 8.9823 - val_mean_absolute_error: 2.0430
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.6368 - mean_absolute_error: 1.1570 - val_loss: 8.8579 - val_mean_absolute_error: 2.1404
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 54ms/step - loss: 3.0280 - mean_absolute_error: 1.1672 - val_loss: 9.5680 - val_mean_absolute_error: 2.2176
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 1.1487 - mean_absolute_error: 0.7931 - val_loss: 9.2332 - val_mean_absolute_error: 2.1429
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 4.5266 - mean_absolute_error: 1.3656 - val_loss: 5.7100 - val_mean_absolute_error: 1.8093
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.2496 - mean_absolute_error: 0.8232 - val_loss: 2.6998 - val_mean_absolute_error: 1.3706
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 4.1529 - mean_absolute_error: 1.1568 - val_loss: 1.3995 - val_mean_absolute_error: 1.0433
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 3.1607 - mean_absolute_error: 1.1757 - val_loss: 0.9109 - val_mean_absolute_error: 0.8802
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.0000 - mean_absolute_error: 1.0435 - val_loss: 0.8327 - val_mean_absolute_error: 0.8607
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 73ms/step - loss: 1.5446 - mean_absolute_error: 0.8308 - val_loss: 0.9626 - val_mean_absolute_error: 0.9190
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 5.2868 - mean_absolute_error: 1.1974 - val_loss: 1.2542 - val_mean_absolute_error: 1.0184
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 3.2483 - mean_absolute_error: 1.1119 - val_loss: 1.8234 - val_mean_absolute_error: 1.1216
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 5.0022 - mean_absolute_error: 1.1130 - val_loss: 2.6326 - val_mean_absolute_error: 1.2131
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.9009 - mean_absolute_error: 1.2540 - val_loss: 3.5274 - val_mean_absolute_error: 1.3035
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.8271 - mean_absolute_error: 0.6653 - val_loss: 4.2743 - val_mean_absolute_error: 1.3948
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.6822 - mean_absolute_error: 1.1378 - val_loss: 5.2369 - val_mean_absolute_error: 1.5342
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.3598 - mean_absolute_error: 1.0898 - val_loss: 5.6643 - val_mean_absolute_error: 1.6302
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.2778 - mean_absolute_error: 0.8463 - val_loss: 5.7560 - val_mean_absolute_error: 1.6841
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.1347 - mean_absolute_error: 0.7158 - val_loss: 4.6112 - val_mean_absolute_error: 1.5828
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.5684 - mean_absolute_error: 0.9440 - val_loss: 3.6779 - val_mean_absolute_error: 1.4767
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 3.0922 - mean_absolute_error: 1.0925 - val_loss: 2.1743 - val_mean_absolute_error: 1.2421
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.7327 - mean_absolute_error: 1.1857 - val_loss: 1.7425 - val_mean_absolute_error: 1.1782
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.8115 - mean_absolute_error: 0.6796 - val_loss: 1.5755 - val_mean_absolute_error: 1.1500
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.7335 - mean_absolute_error: 0.9761 - val_loss: 1.6122 - val_mean_absolute_error: 1.1677
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.9154 - mean_absolute_error: 0.7509 - val_loss: 1.6899 - val_mean_absolute_error: 1.1698
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.9170 - mean_absolute_error: 1.1905 - val_loss: 2.3872 - val_mean_absolute_error: 1.3070
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 49ms/step - loss: 3.3340 - mean_absolute_error: 1.1674 - val_loss: 3.2587 - val_mean_absolute_error: 1.4466
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 50ms/step - loss: 0.5103 - mean_absolute_error: 0.5860 - val_loss: 4.4546 - val_mean_absolute_error: 1.6143
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.8328 - mean_absolute_error: 0.7121 - val_loss: 5.9179 - val_mean_absolute_error: 1.8179
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 49ms/step - loss: 0.8315 - mean_absolute_error: 0.6610 - val_loss: 6.1753 - val_mean_absolute_error: 1.8511
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.7361 - mean_absolute_error: 1.0520 - val_loss: 6.4589 - val_mean_absolute_error: 1.8382
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.3569 - mean_absolute_error: 0.4567 - val_loss: 6.8180 - val_mean_absolute_error: 1.8187

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 60ms/step - loss: 3.2995 - mean_absolute_error: 1.2709 - val_loss: 0.2082 - val_mean_absolute_error: 0.3551
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 9.4353 - mean_absolute_error: 1.4539 - val_loss: 0.2479 - val_mean_absolute_error: 0.3902
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.6063 - mean_absolute_error: 1.2739 - val_loss: 0.2450 - val_mean_absolute_error: 0.3486
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.0705 - mean_absolute_error: 0.7266 - val_loss: 0.2506 - val_mean_absolute_error: 0.2849
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 6.0069 - mean_absolute_error: 1.3128 - val_loss: 0.2392 - val_mean_absolute_error: 0.3154
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.3021 - mean_absolute_error: 0.8857 - val_loss: 0.2493 - val_mean_absolute_error: 0.3974
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 5.8054 - mean_absolute_error: 1.4124 - val_loss: 0.3263 - val_mean_absolute_error: 0.4317
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 3.1302 - mean_absolute_error: 1.2222 - val_loss: 0.4336 - val_mean_absolute_error: 0.4511
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 3.5523 - mean_absolute_error: 1.1729 - val_loss: 0.4734 - val_mean_absolute_error: 0.4569
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 4.2947 - mean_absolute_error: 1.2312 - val_loss: 0.5354 - val_mean_absolute_error: 0.5308
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 36ms/step - loss: 2.9136 - mean_absolute_error: 1.2319 - val_loss: 0.5589 - val_mean_absolute_error: 0.5460
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.7851 - mean_absolute_error: 1.0122 - val_loss: 0.4805 - val_mean_absolute_error: 0.4971
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 101ms/step - loss: 3.3866 - mean_absolute_error: 1.2841 - val_loss: 0.3009 - val_mean_absolute_error: 0.4042
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 1.7746 - mean_absolute_error: 0.9953 - val_loss: 0.2001 - val_mean_absolute_error: 0.3574
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 6.0025 - mean_absolute_error: 1.4660 - val_loss: 0.2878 - val_mean_absolute_error: 0.4547
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 50ms/step - loss: 2.0689 - mean_absolute_error: 1.0030 - val_loss: 0.4336 - val_mean_absolute_error: 0.6143
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 1.4069 - mean_absolute_error: 0.9028 - val_loss: 0.5949 - val_mean_absolute_error: 0.7285
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.3783 - mean_absolute_error: 0.8792 - val_loss: 0.5974 - val_mean_absolute_error: 0.7316
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 77ms/step - loss: 1.1520 - mean_absolute_error: 0.8640 - val_loss: 0.5137 - val_mean_absolute_error: 0.6741
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 3.8414 - mean_absolute_error: 1.4097 - val_loss: 0.4381 - val_mean_absolute_error: 0.5764
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.9138 - mean_absolute_error: 0.7805 - val_loss: 0.3768 - val_mean_absolute_error: 0.5377
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.6052 - mean_absolute_error: 0.9654 - val_loss: 0.3094 - val_mean_absolute_error: 0.4954
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.5700 - mean_absolute_error: 0.9442 - val_loss: 0.3772 - val_mean_absolute_error: 0.5894
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.3373 - mean_absolute_error: 0.9851 - val_loss: 0.5505 - val_mean_absolute_error: 0.6508
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.9357 - mean_absolute_error: 0.7473 - val_loss: 0.6811 - val_mean_absolute_error: 0.6748
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.9187 - mean_absolute_error: 1.0725 - val_loss: 0.8758 - val_mean_absolute_error: 0.7307
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.5504 - mean_absolute_error: 1.2154 - val_loss: 0.9559 - val_mean_absolute_error: 0.8227
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.4012 - mean_absolute_error: 0.7524 - val_loss: 0.8856 - val_mean_absolute_error: 0.8075
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 3.3111 - mean_absolute_error: 1.1224 - val_loss: 0.5365 - val_mean_absolute_error: 0.5578
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.9912 - mean_absolute_error: 0.8120 - val_loss: 0.2653 - val_mean_absolute_error: 0.3697
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.6224 - mean_absolute_error: 1.0734 - val_loss: 0.2350 - val_mean_absolute_error: 0.2921
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.7104 - mean_absolute_error: 1.3346 - val_loss: 0.2343 - val_mean_absolute_error: 0.2799
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 61ms/step - loss: 1.0374 - mean_absolute_error: 0.7201 - val_loss: 0.2855 - val_mean_absolute_error: 0.3886
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 74ms/step - loss: 1.7169 - mean_absolute_error: 0.8783 - val_loss: 0.6001 - val_mean_absolute_error: 0.6349
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 69ms/step - loss: 2.2019 - mean_absolute_error: 1.1875 - val_loss: 0.9232 - val_mean_absolute_error: 0.8041
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.4347 - mean_absolute_error: 1.1563 - val_loss: 1.2191 - val_mean_absolute_error: 0.9251
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 8.1863 - mean_absolute_error: 1.4373 - val_loss: 1.4596 - val_mean_absolute_error: 1.0141
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.5530 - mean_absolute_error: 0.9449 - val_loss: 1.5614 - val_mean_absolute_error: 1.0522
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.7578 - mean_absolute_error: 1.0192 - val_loss: 1.6024 - val_mean_absolute_error: 1.0804
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.5935 - mean_absolute_error: 1.0291 - val_loss: 1.5467 - val_mean_absolute_error: 1.0818
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.4288 - mean_absolute_error: 0.9239 - val_loss: 1.5303 - val_mean_absolute_error: 1.1024
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.2227 - mean_absolute_error: 0.6888 - val_loss: 1.6241 - val_mean_absolute_error: 1.1302
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.2348 - mean_absolute_error: 0.7875 - val_loss: 1.6873 - val_mean_absolute_error: 1.1467
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - loss: 1.1072 - mean_absolute_error: 0.7321 - val_loss: 1.7023 - val_mean_absolute_error: 1.1481
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.2536 - mean_absolute_error: 0.9396 - val_loss: 1.6739 - val_mean_absolute_error: 1.1396
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.0379 - mean_absolute_error: 0.6678 - val_loss: 1.5841 - val_mean_absolute_error: 1.1173
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 1.4183 - mean_absolute_error: 0.7980 - val_loss: 1.4237 - val_mean_absolute_error: 1.0810
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 2.5878 - mean_absolute_error: 0.9918 - val_loss: 1.4198 - val_mean_absolute_error: 1.0832
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 2.4970 - mean_absolute_error: 0.9552 - val_loss: 1.4388 - val_mean_absolute_error: 1.0956
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.3169 - mean_absolute_error: 0.8503 - val_loss: 1.6271 - val_mean_absolute_error: 1.1499

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 63ms/step - loss: 0.8686 - mean_absolute_error: 0.6890 - val_loss: 0.5397 - val_mean_absolute_error: 0.5434
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.2513 - mean_absolute_error: 0.9023 - val_loss: 0.5585 - val_mean_absolute_error: 0.5277
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.0561 - mean_absolute_error: 0.8306 - val_loss: 0.5740 - val_mean_absolute_error: 0.5432
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.0939 - mean_absolute_error: 0.8117 - val_loss: 0.6413 - val_mean_absolute_error: 0.5471
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.2377 - mean_absolute_error: 0.7611 - val_loss: 0.6971 - val_mean_absolute_error: 0.5432
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.5116 - mean_absolute_error: 0.8979 - val_loss: 0.7286 - val_mean_absolute_error: 0.5299
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.0390 - mean_absolute_error: 1.0532 - val_loss: 0.7244 - val_mean_absolute_error: 0.5145
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.7219 - mean_absolute_error: 0.5637 - val_loss: 0.7315 - val_mean_absolute_error: 0.5071
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.6912 - mean_absolute_error: 0.6141 - val_loss: 0.7798 - val_mean_absolute_error: 0.5452
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.0737 - mean_absolute_error: 0.8224 - val_loss: 0.8245 - val_mean_absolute_error: 0.5718
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.6209 - mean_absolute_error: 0.5848 - val_loss: 0.8658 - val_mean_absolute_error: 0.6004
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.9486 - mean_absolute_error: 0.6546 - val_loss: 0.8603 - val_mean_absolute_error: 0.6091
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.6810 - mean_absolute_error: 0.8566 - val_loss: 0.8445 - val_mean_absolute_error: 0.5941
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.7237 - mean_absolute_error: 0.6158 - val_loss: 0.8244 - val_mean_absolute_error: 0.5628
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.1627 - mean_absolute_error: 0.8576 - val_loss: 0.8234 - val_mean_absolute_error: 0.5490
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.7599 - mean_absolute_error: 0.6237 - val_loss: 0.7998 - val_mean_absolute_error: 0.5355
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.6820 - mean_absolute_error: 0.6474 - val_loss: 0.7934 - val_mean_absolute_error: 0.5253
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.7718 - mean_absolute_error: 0.5633 - val_loss: 0.7951 - val_mean_absolute_error: 0.5175
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.7422 - mean_absolute_error: 1.0684 - val_loss: 0.7958 - val_mean_absolute_error: 0.5112
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.0137 - mean_absolute_error: 0.8504 - val_loss: 0.9170 - val_mean_absolute_error: 0.5773
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.4211 - mean_absolute_error: 0.6543 - val_loss: 1.0312 - val_mean_absolute_error: 0.6294
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.3238 - mean_absolute_error: 1.1176 - val_loss: 1.0781 - val_mean_absolute_error: 0.6420
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.3539 - mean_absolute_error: 0.7221 - val_loss: 1.1003 - val_mean_absolute_error: 0.6557
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.3888 - mean_absolute_error: 0.8827 - val_loss: 1.0991 - val_mean_absolute_error: 0.6544
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 101ms/step - loss: 0.8874 - mean_absolute_error: 0.7133 - val_loss: 0.9748 - val_mean_absolute_error: 0.5997
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 80ms/step - loss: 1.0970 - mean_absolute_error: 0.6359 - val_loss: 0.8816 - val_mean_absolute_error: 0.5540
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.6170 - mean_absolute_error: 1.0522 - val_loss: 0.8001 - val_mean_absolute_error: 0.5463
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.7006 - mean_absolute_error: 0.5675 - val_loss: 0.7777 - val_mean_absolute_error: 0.5704
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.7394 - mean_absolute_error: 0.8326 - val_loss: 0.7465 - val_mean_absolute_error: 0.5832
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.6489 - mean_absolute_error: 0.6225 - val_loss: 0.6942 - val_mean_absolute_error: 0.5725
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.9127 - mean_absolute_error: 0.6096 - val_loss: 0.6485 - val_mean_absolute_error: 0.5630
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.6758 - mean_absolute_error: 0.6163 - val_loss: 0.6354 - val_mean_absolute_error: 0.5600
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.4056 - mean_absolute_error: 0.5054 - val_loss: 0.6119 - val_mean_absolute_error: 0.5403
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 3.7620 - mean_absolute_error: 0.9853 - val_loss: 0.6152 - val_mean_absolute_error: 0.5309
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.5795 - mean_absolute_error: 0.5150 - val_loss: 0.6317 - val_mean_absolute_error: 0.5279
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.4914 - mean_absolute_error: 0.9063 - val_loss: 0.5676 - val_mean_absolute_error: 0.4944
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.8275 - mean_absolute_error: 0.6424 - val_loss: 0.5095 - val_mean_absolute_error: 0.4646
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.8021 - mean_absolute_error: 0.5943 - val_loss: 0.4433 - val_mean_absolute_error: 0.4260
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.2297 - mean_absolute_error: 0.8251 - val_loss: 0.4428 - val_mean_absolute_error: 0.4207
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.6274 - mean_absolute_error: 0.5909 - val_loss: 0.4487 - val_mean_absolute_error: 0.4162
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4570 - mean_absolute_error: 0.4742 - val_loss: 0.4430 - val_mean_absolute_error: 0.4007
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.0309 - mean_absolute_error: 0.6928 - val_loss: 0.5007 - val_mean_absolute_error: 0.4177
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.1099 - mean_absolute_error: 0.7542 - val_loss: 0.7046 - val_mean_absolute_error: 0.5161
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.0539 - mean_absolute_error: 0.7684 - val_loss: 0.9956 - val_mean_absolute_error: 0.6041
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 5.6703 - mean_absolute_error: 1.3031 - val_loss: 1.2962 - val_mean_absolute_error: 0.6688
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 37ms/step - loss: 0.7662 - mean_absolute_error: 0.6114 - val_loss: 1.3601 - val_mean_absolute_error: 0.6862
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.7958 - mean_absolute_error: 0.6419 - val_loss: 1.4840 - val_mean_absolute_error: 0.7125
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.8327 - mean_absolute_error: 0.8074 - val_loss: 1.6328 - val_mean_absolute_error: 0.7501
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.9914 - mean_absolute_error: 0.6528 - val_loss: 1.6806 - val_mean_absolute_error: 0.7602
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.7831 - mean_absolute_error: 0.7010 - val_loss: 1.7416 - val_mean_absolute_error: 0.7825

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - loss: 4.5161 - mean_absolute_error: 1.2121 - val_loss: 0.7625 - val_mean_absolute_error: 0.5190
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.1670 - mean_absolute_error: 0.8850 - val_loss: 0.9558 - val_mean_absolute_error: 0.6055
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 70ms/step - loss: 1.4656 - mean_absolute_error: 0.8923 - val_loss: 1.2820 - val_mean_absolute_error: 0.7341
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.8576 - mean_absolute_error: 0.6265 - val_loss: 1.6144 - val_mean_absolute_error: 0.8587
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.9819 - mean_absolute_error: 1.2169 - val_loss: 1.7738 - val_mean_absolute_error: 0.9288
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 1.6938 - mean_absolute_error: 0.9929 - val_loss: 1.7577 - val_mean_absolute_error: 0.9283
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.4667 - mean_absolute_error: 1.0036 - val_loss: 1.6509 - val_mean_absolute_error: 0.8951
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.6997 - mean_absolute_error: 0.6922 - val_loss: 1.4756 - val_mean_absolute_error: 0.8458
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 3.2132 - mean_absolute_error: 1.1927 - val_loss: 1.1334 - val_mean_absolute_error: 0.7575
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.5128 - mean_absolute_error: 0.8096 - val_loss: 0.7692 - val_mean_absolute_error: 0.6489
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 6.2823 - mean_absolute_error: 1.2894 - val_loss: 0.7356 - val_mean_absolute_error: 0.6273
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.2424 - mean_absolute_error: 0.9812 - val_loss: 0.8433 - val_mean_absolute_error: 0.6548
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.7686 - mean_absolute_error: 0.7216 - val_loss: 1.0294 - val_mean_absolute_error: 0.7088
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.7684 - mean_absolute_error: 0.7970 - val_loss: 1.1968 - val_mean_absolute_error: 0.7693
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.9754 - mean_absolute_error: 0.9534 - val_loss: 1.4295 - val_mean_absolute_error: 0.8412
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.1052 - mean_absolute_error: 0.7569 - val_loss: 1.5148 - val_mean_absolute_error: 0.8779
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.9308 - mean_absolute_error: 0.6756 - val_loss: 1.5481 - val_mean_absolute_error: 0.9066
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.5537 - mean_absolute_error: 0.9322 - val_loss: 1.5575 - val_mean_absolute_error: 0.9113
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.2556 - mean_absolute_error: 0.8380 - val_loss: 1.4159 - val_mean_absolute_error: 0.8691
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 1.8831 - mean_absolute_error: 0.8799 - val_loss: 1.3296 - val_mean_absolute_error: 0.8479
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.2277 - mean_absolute_error: 0.7906 - val_loss: 1.1809 - val_mean_absolute_error: 0.8107
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.1007 - mean_absolute_error: 0.7879 - val_loss: 1.0187 - val_mean_absolute_error: 0.7712
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.0470 - mean_absolute_error: 0.6742 - val_loss: 0.9397 - val_mean_absolute_error: 0.7602
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 5.0702 - mean_absolute_error: 1.1465 - val_loss: 1.1844 - val_mean_absolute_error: 0.8501
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.9681 - mean_absolute_error: 0.7082 - val_loss: 1.2979 - val_mean_absolute_error: 0.8912
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.5989 - mean_absolute_error: 0.8135 - val_loss: 1.5944 - val_mean_absolute_error: 0.9813
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.4157 - mean_absolute_error: 0.8267 - val_loss: 1.8144 - val_mean_absolute_error: 1.0405
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.2569 - mean_absolute_error: 0.6729 - val_loss: 2.1043 - val_mean_absolute_error: 1.1131
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.6089 - mean_absolute_error: 0.6383 - val_loss: 2.2191 - val_mean_absolute_error: 1.1457
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 75ms/step - loss: 2.5868 - mean_absolute_error: 0.9243 - val_loss: 2.2563 - val_mean_absolute_error: 1.1545
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.0213 - mean_absolute_error: 0.7308 - val_loss: 2.4672 - val_mean_absolute_error: 1.1985
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.3107 - mean_absolute_error: 0.4506 - val_loss: 2.5562 - val_mean_absolute_error: 1.2071
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.4006 - mean_absolute_error: 1.0324 - val_loss: 2.4536 - val_mean_absolute_error: 1.1725
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.6917 - mean_absolute_error: 0.6428 - val_loss: 2.2952 - val_mean_absolute_error: 1.1293
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.7352 - mean_absolute_error: 0.6847 - val_loss: 2.1131 - val_mean_absolute_error: 1.0824
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.4073 - mean_absolute_error: 0.8321 - val_loss: 1.8005 - val_mean_absolute_error: 1.0125
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.4550 - mean_absolute_error: 0.5479 - val_loss: 1.6684 - val_mean_absolute_error: 1.0005
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.2732 - mean_absolute_error: 0.7863 - val_loss: 1.5783 - val_mean_absolute_error: 0.9936
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.6584 - mean_absolute_error: 0.7358 - val_loss: 2.1165 - val_mean_absolute_error: 1.1210
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.6429 - mean_absolute_error: 0.5550 - val_loss: 2.4622 - val_mean_absolute_error: 1.1865
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.0218 - mean_absolute_error: 0.7705 - val_loss: 2.5630 - val_mean_absolute_error: 1.1940
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 1.5231 - mean_absolute_error: 0.7148 - val_loss: 2.4609 - val_mean_absolute_error: 1.1681
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.4298 - mean_absolute_error: 0.4905 - val_loss: 2.3227 - val_mean_absolute_error: 1.1450
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.7448 - mean_absolute_error: 0.9039 - val_loss: 2.0273 - val_mean_absolute_error: 1.0889
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.5961 - mean_absolute_error: 0.7641 - val_loss: 1.6029 - val_mean_absolute_error: 0.9959
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.7971 - mean_absolute_error: 0.6044 - val_loss: 1.1716 - val_mean_absolute_error: 0.8834
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.6495 - mean_absolute_error: 1.0060 - val_loss: 0.9503 - val_mean_absolute_error: 0.8194
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.9393 - mean_absolute_error: 0.7870 - val_loss: 0.7973 - val_mean_absolute_error: 0.7680
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 5.6515 - mean_absolute_error: 1.1508 - val_loss: 0.8206 - val_mean_absolute_error: 0.7653
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.1961 - mean_absolute_error: 0.8073 - val_loss: 0.8114 - val_mean_absolute_error: 0.7470
Validation losses: [37.054588317871094, 6.8180155754089355, 1.6270540952682495, 1.7415810823440552, 0.811400830745697]
HPS: {'player_emb_dim': 32, 'dense_units': 80, 'dense_units_2': 48, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 2}. MSE during RandomSearch: 1.496654987335205. Starting evaluation across all k folds...

FOLD 1
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 2s 2s/step - loss: 3.7210 - mean_absolute_error: 1.5711 - val_loss: 34.5427 - val_mean_absolute_error: 4.7565
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.8363 - mean_absolute_error: 1.0348 - val_loss: 28.9503 - val_mean_absolute_error: 4.0909
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 3.3243 - mean_absolute_error: 1.3124 - val_loss: 35.4130 - val_mean_absolute_error: 4.4901
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.4323 - mean_absolute_error: 1.2764 - val_loss: 41.9434 - val_mean_absolute_error: 4.9682
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 3.6580 - mean_absolute_error: 1.7012 - val_loss: 34.4222 - val_mean_absolute_error: 4.4396
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.7410 - mean_absolute_error: 0.6532 - val_loss: 35.3162 - val_mean_absolute_error: 4.5263
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 74ms/step - loss: 1.7370 - mean_absolute_error: 0.9205 - val_loss: 36.0967 - val_mean_absolute_error: 4.5757
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.4018 - mean_absolute_error: 1.3170 - val_loss: 36.7466 - val_mean_absolute_error: 4.6703
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.2336 - mean_absolute_error: 0.8241 - val_loss: 37.7112 - val_mean_absolute_error: 4.7718
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.8628 - mean_absolute_error: 0.7345 - val_loss: 38.7637 - val_mean_absolute_error: 4.8612
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.8821 - mean_absolute_error: 0.6933 - val_loss: 36.4464 - val_mean_absolute_error: 4.6951
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.7756 - mean_absolute_error: 0.6686 - val_loss: 32.9763 - val_mean_absolute_error: 4.4052
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.2156 - mean_absolute_error: 0.9270 - val_loss: 34.0385 - val_mean_absolute_error: 4.4662
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.1433 - mean_absolute_error: 0.8484 - val_loss: 35.1349 - val_mean_absolute_error: 4.5820
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.9686 - mean_absolute_error: 0.7778 - val_loss: 35.2302 - val_mean_absolute_error: 4.6050
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.8251 - mean_absolute_error: 1.0423 - val_loss: 35.4015 - val_mean_absolute_error: 4.5586
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.7214 - mean_absolute_error: 0.6472 - val_loss: 35.3966 - val_mean_absolute_error: 4.4649
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.9525 - mean_absolute_error: 0.7787 - val_loss: 37.4102 - val_mean_absolute_error: 4.6341
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.2566 - mean_absolute_error: 0.7391 - val_loss: 39.5372 - val_mean_absolute_error: 4.7884
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.7005 - mean_absolute_error: 0.6898 - val_loss: 40.6496 - val_mean_absolute_error: 4.8998
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.4995 - mean_absolute_error: 0.5393 - val_loss: 41.0740 - val_mean_absolute_error: 5.0408
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.6622 - mean_absolute_error: 0.6672 - val_loss: 41.1350 - val_mean_absolute_error: 5.0899
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.1085 - mean_absolute_error: 0.7276 - val_loss: 40.2064 - val_mean_absolute_error: 5.0408
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.3657 - mean_absolute_error: 0.9123 - val_loss: 38.9300 - val_mean_absolute_error: 4.9436
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.1012 - mean_absolute_error: 0.8442 - val_loss: 38.1059 - val_mean_absolute_error: 4.8541
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.7524 - mean_absolute_error: 0.6327 - val_loss: 37.0089 - val_mean_absolute_error: 4.7293
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.7436 - mean_absolute_error: 0.6241 - val_loss: 35.6927 - val_mean_absolute_error: 4.5921
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.4081 - mean_absolute_error: 0.4737 - val_loss: 34.8903 - val_mean_absolute_error: 4.4922
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.3714 - mean_absolute_error: 0.8551 - val_loss: 34.4240 - val_mean_absolute_error: 4.4451
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.4990 - mean_absolute_error: 1.0350 - val_loss: 35.2764 - val_mean_absolute_error: 4.5162
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.6489 - mean_absolute_error: 0.8233 - val_loss: 36.1011 - val_mean_absolute_error: 4.5806
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.8692 - mean_absolute_error: 0.7413 - val_loss: 36.8024 - val_mean_absolute_error: 4.6702
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.9722 - mean_absolute_error: 0.7046 - val_loss: 36.6290 - val_mean_absolute_error: 4.6681
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.8396 - mean_absolute_error: 0.6250 - val_loss: 36.0776 - val_mean_absolute_error: 4.6131
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.4414 - mean_absolute_error: 0.8088 - val_loss: 35.8098 - val_mean_absolute_error: 4.6157
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 72ms/step - loss: 0.4896 - mean_absolute_error: 0.4776 - val_loss: 35.8605 - val_mean_absolute_error: 4.6569
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.6489 - mean_absolute_error: 0.6036 - val_loss: 35.4340 - val_mean_absolute_error: 4.6566
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.5583 - mean_absolute_error: 0.8391 - val_loss: 35.7673 - val_mean_absolute_error: 4.6651
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 0.4290 - mean_absolute_error: 0.5624 - val_loss: 36.5483 - val_mean_absolute_error: 4.6955
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 0.7766 - mean_absolute_error: 0.6422 - val_loss: 37.2169 - val_mean_absolute_error: 4.7229
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.7004 - mean_absolute_error: 0.6357 - val_loss: 36.9219 - val_mean_absolute_error: 4.6632
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 65ms/step - loss: 0.9322 - mean_absolute_error: 0.7613 - val_loss: 36.6968 - val_mean_absolute_error: 4.6238
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 144ms/step - loss: 1.7522 - mean_absolute_error: 1.0594 - val_loss: 35.2018 - val_mean_absolute_error: 4.5283
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.8147 - mean_absolute_error: 0.6997 - val_loss: 35.1046 - val_mean_absolute_error: 4.5330
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.4694 - mean_absolute_error: 0.7694 - val_loss: 36.5052 - val_mean_absolute_error: 4.6939
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.8863 - mean_absolute_error: 0.6848 - val_loss: 37.6941 - val_mean_absolute_error: 4.8076
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.4218 - mean_absolute_error: 0.7976 - val_loss: 38.2018 - val_mean_absolute_error: 4.8610
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.1149 - mean_absolute_error: 0.7682 - val_loss: 38.2632 - val_mean_absolute_error: 4.8645
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.9307 - mean_absolute_error: 0.7263 - val_loss: 38.4224 - val_mean_absolute_error: 4.8668
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.6766 - mean_absolute_error: 0.5739 - val_loss: 38.3986 - val_mean_absolute_error: 4.8444

FOLD 2
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 68ms/step - loss: 10.3977 - mean_absolute_error: 1.8171 - val_loss: 0.3709 - val_mean_absolute_error: 0.5324
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 10.7669 - mean_absolute_error: 1.6626 - val_loss: 0.5376 - val_mean_absolute_error: 0.6296
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 10.8175 - mean_absolute_error: 1.8019 - val_loss: 0.7449 - val_mean_absolute_error: 0.7957
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 9.0922 - mean_absolute_error: 1.7373 - val_loss: 0.8356 - val_mean_absolute_error: 0.8846
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 10.8008 - mean_absolute_error: 1.9869 - val_loss: 0.8856 - val_mean_absolute_error: 0.9098
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 9.8349 - mean_absolute_error: 1.5839 - val_loss: 0.8180 - val_mean_absolute_error: 0.8714
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 11.2021 - mean_absolute_error: 2.0833 - val_loss: 0.6865 - val_mean_absolute_error: 0.7894
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 8.1435 - mean_absolute_error: 1.5470 - val_loss: 0.4519 - val_mean_absolute_error: 0.6493
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 9.3905 - mean_absolute_error: 1.6877 - val_loss: 0.3162 - val_mean_absolute_error: 0.5264
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 9.0553 - mean_absolute_error: 1.7254 - val_loss: 0.1665 - val_mean_absolute_error: 0.3367
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 8.9955 - mean_absolute_error: 1.6012 - val_loss: 0.1121 - val_mean_absolute_error: 0.2020
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 4.2297 - mean_absolute_error: 1.2816 - val_loss: 0.1995 - val_mean_absolute_error: 0.3985
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 6.7150 - mean_absolute_error: 1.6721 - val_loss: 0.4350 - val_mean_absolute_error: 0.5460
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 5.0817 - mean_absolute_error: 1.5249 - val_loss: 0.9711 - val_mean_absolute_error: 0.6362
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 75ms/step - loss: 4.7431 - mean_absolute_error: 1.3664 - val_loss: 2.4428 - val_mean_absolute_error: 0.9875
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 3.6284 - mean_absolute_error: 1.4626 - val_loss: 4.9898 - val_mean_absolute_error: 1.3732
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.8366 - mean_absolute_error: 1.1607 - val_loss: 7.1207 - val_mean_absolute_error: 1.6520
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.7527 - mean_absolute_error: 1.1736 - val_loss: 8.5397 - val_mean_absolute_error: 1.7889
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 5.6332 - mean_absolute_error: 1.5749 - val_loss: 12.4290 - val_mean_absolute_error: 2.0763
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 3.0022 - mean_absolute_error: 1.2904 - val_loss: 17.0695 - val_mean_absolute_error: 2.3884
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 3.3451 - mean_absolute_error: 1.5286 - val_loss: 18.3452 - val_mean_absolute_error: 2.5286
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 3.8391 - mean_absolute_error: 1.6096 - val_loss: 18.9633 - val_mean_absolute_error: 2.5866
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.6436 - mean_absolute_error: 1.1878 - val_loss: 21.0639 - val_mean_absolute_error: 2.7153
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.6184 - mean_absolute_error: 1.1420 - val_loss: 23.9266 - val_mean_absolute_error: 2.9875
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.7048 - mean_absolute_error: 1.2921 - val_loss: 30.1170 - val_mean_absolute_error: 3.4042
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 7.9070 - mean_absolute_error: 1.8940 - val_loss: 26.6850 - val_mean_absolute_error: 3.2200
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 102ms/step - loss: 9.0066 - mean_absolute_error: 1.6620 - val_loss: 14.6248 - val_mean_absolute_error: 2.5529
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 5.7261 - mean_absolute_error: 1.4562 - val_loss: 9.0177 - val_mean_absolute_error: 2.0898
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 4.2582 - mean_absolute_error: 1.2728 - val_loss: 3.7769 - val_mean_absolute_error: 1.4783
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.9396 - mean_absolute_error: 1.2323 - val_loss: 1.7076 - val_mean_absolute_error: 1.1072
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.8993 - mean_absolute_error: 1.1341 - val_loss: 0.8488 - val_mean_absolute_error: 0.6521
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 6.1176 - mean_absolute_error: 1.5086 - val_loss: 1.2346 - val_mean_absolute_error: 0.8309
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 56ms/step - loss: 2.6364 - mean_absolute_error: 1.2340 - val_loss: 1.1521 - val_mean_absolute_error: 0.7671
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 72ms/step - loss: 3.3884 - mean_absolute_error: 1.3278 - val_loss: 0.9000 - val_mean_absolute_error: 0.6558
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 3.3870 - mean_absolute_error: 1.1645 - val_loss: 1.1209 - val_mean_absolute_error: 0.9501
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.9493 - mean_absolute_error: 0.9281 - val_loss: 2.3218 - val_mean_absolute_error: 1.2794
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 3.0540 - mean_absolute_error: 1.4105 - val_loss: 5.3788 - val_mean_absolute_error: 1.5886
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 2.1390 - mean_absolute_error: 1.2595 - val_loss: 10.4942 - val_mean_absolute_error: 2.1628
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 4.2659 - mean_absolute_error: 1.3408 - val_loss: 11.7578 - val_mean_absolute_error: 2.2365
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 11.9589 - mean_absolute_error: 1.7832 - val_loss: 6.1111 - val_mean_absolute_error: 1.6407
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.2912 - mean_absolute_error: 0.8548 - val_loss: 3.0393 - val_mean_absolute_error: 1.1426
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 7.2237 - mean_absolute_error: 1.4685 - val_loss: 1.5135 - val_mean_absolute_error: 0.8743
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 71ms/step - loss: 1.7085 - mean_absolute_error: 1.0127 - val_loss: 0.9761 - val_mean_absolute_error: 0.7643
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.5228 - mean_absolute_error: 0.9516 - val_loss: 1.1316 - val_mean_absolute_error: 0.7749
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 4.6946 - mean_absolute_error: 1.4435 - val_loss: 1.5199 - val_mean_absolute_error: 0.9399
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 4.0806 - mean_absolute_error: 1.1954 - val_loss: 2.1058 - val_mean_absolute_error: 1.0525
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 5.6537 - mean_absolute_error: 1.3125 - val_loss: 3.5584 - val_mean_absolute_error: 1.3240
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.5445 - mean_absolute_error: 0.7862 - val_loss: 6.0704 - val_mean_absolute_error: 1.6397
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.9241 - mean_absolute_error: 1.2997 - val_loss: 9.5385 - val_mean_absolute_error: 1.9690
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 5.0524 - mean_absolute_error: 1.2627 - val_loss: 14.6121 - val_mean_absolute_error: 2.3573

FOLD 3
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 69ms/step - loss: 2.2879 - mean_absolute_error: 1.2556 - val_loss: 2.4234 - val_mean_absolute_error: 1.1912
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 16.7499 - mean_absolute_error: 2.4318 - val_loss: 1.2347 - val_mean_absolute_error: 0.8620
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 7.0897 - mean_absolute_error: 1.7321 - val_loss: 0.6307 - val_mean_absolute_error: 0.5838
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.5295 - mean_absolute_error: 0.9245 - val_loss: 0.6352 - val_mean_absolute_error: 0.4615
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 2.4515 - mean_absolute_error: 1.0535 - val_loss: 0.8846 - val_mean_absolute_error: 0.5791
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.1124 - mean_absolute_error: 0.8324 - val_loss: 1.0860 - val_mean_absolute_error: 0.6896
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 6.4412 - mean_absolute_error: 1.3569 - val_loss: 1.2136 - val_mean_absolute_error: 0.7391
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 2.4347 - mean_absolute_error: 1.1660 - val_loss: 1.1866 - val_mean_absolute_error: 0.7435
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 4.4414 - mean_absolute_error: 1.3826 - val_loss: 1.0733 - val_mean_absolute_error: 0.7278
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.2942 - mean_absolute_error: 1.2012 - val_loss: 0.9719 - val_mean_absolute_error: 0.6608
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.8305 - mean_absolute_error: 1.0980 - val_loss: 0.7639 - val_mean_absolute_error: 0.5973
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.9895 - mean_absolute_error: 0.8124 - val_loss: 0.5822 - val_mean_absolute_error: 0.5157
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 6.4174 - mean_absolute_error: 1.5435 - val_loss: 0.7561 - val_mean_absolute_error: 0.4686
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.2808 - mean_absolute_error: 0.8016 - val_loss: 1.2698 - val_mean_absolute_error: 0.7161
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 3.8232 - mean_absolute_error: 1.3603 - val_loss: 2.0482 - val_mean_absolute_error: 1.0305
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 2.8421 - mean_absolute_error: 1.0650 - val_loss: 2.9002 - val_mean_absolute_error: 1.3126
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - loss: 3.0750 - mean_absolute_error: 1.2379 - val_loss: 3.5222 - val_mean_absolute_error: 1.4866
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 1.8148 - mean_absolute_error: 1.0704 - val_loss: 3.9300 - val_mean_absolute_error: 1.5899
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.5106 - mean_absolute_error: 1.3315 - val_loss: 3.9917 - val_mean_absolute_error: 1.6070
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 3.7471 - mean_absolute_error: 1.2837 - val_loss: 3.4889 - val_mean_absolute_error: 1.4538
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.9857 - mean_absolute_error: 1.1003 - val_loss: 2.6921 - val_mean_absolute_error: 1.1916
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.0760 - mean_absolute_error: 0.7963 - val_loss: 1.9388 - val_mean_absolute_error: 0.9563
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.7558 - mean_absolute_error: 0.6520 - val_loss: 1.4499 - val_mean_absolute_error: 0.8162
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 3.7041 - mean_absolute_error: 1.1635 - val_loss: 1.2868 - val_mean_absolute_error: 0.7503
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.0717 - mean_absolute_error: 1.2101 - val_loss: 1.4264 - val_mean_absolute_error: 0.7493
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.5287 - mean_absolute_error: 0.9400 - val_loss: 1.6892 - val_mean_absolute_error: 0.8272
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.0909 - mean_absolute_error: 0.9093 - val_loss: 1.9080 - val_mean_absolute_error: 0.9154
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 1.5780 - mean_absolute_error: 0.8921 - val_loss: 2.0069 - val_mean_absolute_error: 0.9441
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.9681 - mean_absolute_error: 1.1797 - val_loss: 2.0455 - val_mean_absolute_error: 0.9652
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 3.6576 - mean_absolute_error: 1.0332 - val_loss: 2.1485 - val_mean_absolute_error: 1.0157
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.1079 - mean_absolute_error: 0.7364 - val_loss: 2.1150 - val_mean_absolute_error: 1.0205
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.9791 - mean_absolute_error: 0.7535 - val_loss: 2.0569 - val_mean_absolute_error: 1.0320
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 2.1014 - mean_absolute_error: 0.9293 - val_loss: 1.9466 - val_mean_absolute_error: 1.0425
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 3.8308 - mean_absolute_error: 1.3033 - val_loss: 2.2246 - val_mean_absolute_error: 1.1462
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.3263 - mean_absolute_error: 0.8900 - val_loss: 2.4014 - val_mean_absolute_error: 1.2126
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.2851 - mean_absolute_error: 0.8051 - val_loss: 2.4951 - val_mean_absolute_error: 1.2625
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 2.3291 - mean_absolute_error: 1.0615 - val_loss: 2.5405 - val_mean_absolute_error: 1.2884
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.0161 - mean_absolute_error: 0.7637 - val_loss: 2.2704 - val_mean_absolute_error: 1.2061
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 96ms/step - loss: 2.8258 - mean_absolute_error: 1.0517 - val_loss: 1.9505 - val_mean_absolute_error: 1.1020
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.9600 - mean_absolute_error: 1.2350 - val_loss: 1.7381 - val_mean_absolute_error: 1.0024
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 2.5306 - mean_absolute_error: 0.7724 - val_loss: 1.5201 - val_mean_absolute_error: 0.8810
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.3128 - mean_absolute_error: 1.0830 - val_loss: 1.4670 - val_mean_absolute_error: 0.8331
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 1.4242 - mean_absolute_error: 0.9180 - val_loss: 1.4216 - val_mean_absolute_error: 0.8029
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - loss: 2.9618 - mean_absolute_error: 0.7694 - val_loss: 1.3511 - val_mean_absolute_error: 0.7586
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 4.0797 - mean_absolute_error: 1.1434 - val_loss: 1.2000 - val_mean_absolute_error: 0.6795
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 1.5979 - mean_absolute_error: 0.8700 - val_loss: 1.2032 - val_mean_absolute_error: 0.6705
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.7351 - mean_absolute_error: 0.9952 - val_loss: 1.1539 - val_mean_absolute_error: 0.6536
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.8217 - mean_absolute_error: 0.7041 - val_loss: 1.1572 - val_mean_absolute_error: 0.6860
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.3566 - mean_absolute_error: 0.8237 - val_loss: 1.2901 - val_mean_absolute_error: 0.7748
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.7707 - mean_absolute_error: 1.0651 - val_loss: 1.4093 - val_mean_absolute_error: 0.8646

FOLD 4
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 66ms/step - loss: 4.8672 - mean_absolute_error: 1.2169 - val_loss: 0.3839 - val_mean_absolute_error: 0.4315
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.2852 - mean_absolute_error: 0.8763 - val_loss: 0.3506 - val_mean_absolute_error: 0.4337
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 4.1332 - mean_absolute_error: 1.2580 - val_loss: 0.3394 - val_mean_absolute_error: 0.4494
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 3.4643 - mean_absolute_error: 1.1320 - val_loss: 0.3162 - val_mean_absolute_error: 0.4401
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 4.1881 - mean_absolute_error: 1.1362 - val_loss: 0.2807 - val_mean_absolute_error: 0.4081
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.7350 - mean_absolute_error: 0.6764 - val_loss: 0.2448 - val_mean_absolute_error: 0.3659
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 3.3476 - mean_absolute_error: 1.0775 - val_loss: 0.2576 - val_mean_absolute_error: 0.4378
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 0.9191 - mean_absolute_error: 0.8009 - val_loss: 0.3059 - val_mean_absolute_error: 0.4990
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.4469 - mean_absolute_error: 0.9682 - val_loss: 0.3845 - val_mean_absolute_error: 0.5664
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.3649 - mean_absolute_error: 0.9586 - val_loss: 0.4092 - val_mean_absolute_error: 0.5849
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.8506 - mean_absolute_error: 1.0394 - val_loss: 0.4190 - val_mean_absolute_error: 0.5900
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 53ms/step - loss: 1.1227 - mean_absolute_error: 0.8345 - val_loss: 0.4164 - val_mean_absolute_error: 0.5849
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 28.9153 - mean_absolute_error: 2.3895 - val_loss: 0.2838 - val_mean_absolute_error: 0.4808
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.7381 - mean_absolute_error: 1.0676 - val_loss: 0.3196 - val_mean_absolute_error: 0.3750
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.4299 - mean_absolute_error: 0.9437 - val_loss: 0.4399 - val_mean_absolute_error: 0.3953
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 2.9291 - mean_absolute_error: 1.1550 - val_loss: 0.4782 - val_mean_absolute_error: 0.4374
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 4.7653 - mean_absolute_error: 1.3856 - val_loss: 0.4399 - val_mean_absolute_error: 0.4245
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 4.5475 - mean_absolute_error: 1.2443 - val_loss: 0.4166 - val_mean_absolute_error: 0.4545
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 5.1505 - mean_absolute_error: 1.5380 - val_loss: 0.4517 - val_mean_absolute_error: 0.5863
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.2238 - mean_absolute_error: 0.9877 - val_loss: 0.5745 - val_mean_absolute_error: 0.7227
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 5.6401 - mean_absolute_error: 1.3386 - val_loss: 0.6863 - val_mean_absolute_error: 0.7949
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 75ms/step - loss: 1.3671 - mean_absolute_error: 0.8848 - val_loss: 0.7549 - val_mean_absolute_error: 0.8347
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.6850 - mean_absolute_error: 1.1925 - val_loss: 0.7858 - val_mean_absolute_error: 0.8538
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 5.5378 - mean_absolute_error: 1.3763 - val_loss: 0.7942 - val_mean_absolute_error: 0.8608
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.5250 - mean_absolute_error: 1.0449 - val_loss: 0.7560 - val_mean_absolute_error: 0.8399
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.5365 - mean_absolute_error: 0.9634 - val_loss: 0.6834 - val_mean_absolute_error: 0.7875
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.3355 - mean_absolute_error: 0.9581 - val_loss: 0.6192 - val_mean_absolute_error: 0.7363
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.7047 - mean_absolute_error: 0.5791 - val_loss: 0.5650 - val_mean_absolute_error: 0.6754
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.3183 - mean_absolute_error: 0.9975 - val_loss: 0.5576 - val_mean_absolute_error: 0.6534
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 4.4666 - mean_absolute_error: 1.2680 - val_loss: 0.5568 - val_mean_absolute_error: 0.6541
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 6.0519 - mean_absolute_error: 1.2716 - val_loss: 0.5750 - val_mean_absolute_error: 0.6422
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.1027 - mean_absolute_error: 0.7477 - val_loss: 0.5867 - val_mean_absolute_error: 0.6219
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.3673 - mean_absolute_error: 0.4922 - val_loss: 0.5976 - val_mean_absolute_error: 0.6002
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 4.0413 - mean_absolute_error: 1.0862 - val_loss: 0.5990 - val_mean_absolute_error: 0.5814
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.8040 - mean_absolute_error: 0.6723 - val_loss: 0.5944 - val_mean_absolute_error: 0.5693
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 3.2522 - mean_absolute_error: 0.9471 - val_loss: 0.5841 - val_mean_absolute_error: 0.5567
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 4.2609 - mean_absolute_error: 1.0150 - val_loss: 0.5566 - val_mean_absolute_error: 0.5273
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 1.0078 - mean_absolute_error: 0.7256 - val_loss: 0.5277 - val_mean_absolute_error: 0.4961
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 4.3260 - mean_absolute_error: 1.1903 - val_loss: 0.4903 - val_mean_absolute_error: 0.4600
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.9498 - mean_absolute_error: 0.9642 - val_loss: 0.4668 - val_mean_absolute_error: 0.4781
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 3.2530 - mean_absolute_error: 1.1164 - val_loss: 0.4622 - val_mean_absolute_error: 0.5053
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 4.9693 - mean_absolute_error: 0.9622 - val_loss: 0.4584 - val_mean_absolute_error: 0.5316
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.4680 - mean_absolute_error: 0.5608 - val_loss: 0.4670 - val_mean_absolute_error: 0.5596
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.7148 - mean_absolute_error: 0.6059 - val_loss: 0.4748 - val_mean_absolute_error: 0.5786
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 74ms/step - loss: 2.0448 - mean_absolute_error: 0.7415 - val_loss: 0.5006 - val_mean_absolute_error: 0.6054
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.4061 - mean_absolute_error: 0.7287 - val_loss: 0.5173 - val_mean_absolute_error: 0.6100
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.9160 - mean_absolute_error: 0.6954 - val_loss: 0.5283 - val_mean_absolute_error: 0.6041
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.6188 - mean_absolute_error: 0.5872 - val_loss: 0.5514 - val_mean_absolute_error: 0.6406
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.6089 - mean_absolute_error: 0.8794 - val_loss: 0.6044 - val_mean_absolute_error: 0.6774
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.2718 - mean_absolute_error: 0.7000 - val_loss: 0.6391 - val_mean_absolute_error: 0.6911

FOLD 5
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 69ms/step - loss: 2.2608 - mean_absolute_error: 0.9637 - val_loss: 0.0428 - val_mean_absolute_error: 0.1369
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 0.6094 - mean_absolute_error: 0.6637 - val_loss: 0.0581 - val_mean_absolute_error: 0.1904
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - loss: 1.9865 - mean_absolute_error: 0.8809 - val_loss: 0.0630 - val_mean_absolute_error: 0.2041
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 56ms/step - loss: 1.6929 - mean_absolute_error: 0.7477 - val_loss: 0.0595 - val_mean_absolute_error: 0.1886
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 1.4586 - mean_absolute_error: 0.7813 - val_loss: 0.0626 - val_mean_absolute_error: 0.1797
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 0.8068 - mean_absolute_error: 0.7054 - val_loss: 0.0714 - val_mean_absolute_error: 0.2154
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.4013 - mean_absolute_error: 1.0872 - val_loss: 0.3458 - val_mean_absolute_error: 0.4006
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 0.4411 - mean_absolute_error: 0.5438 - val_loss: 0.8494 - val_mean_absolute_error: 0.6216
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 103ms/step - loss: 2.4708 - mean_absolute_error: 0.9338 - val_loss: 1.2660 - val_mean_absolute_error: 0.7529
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 4.4159 - mean_absolute_error: 1.1743 - val_loss: 1.4773 - val_mean_absolute_error: 0.8104
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 2.5681 - mean_absolute_error: 1.1015 - val_loss: 1.5033 - val_mean_absolute_error: 0.8183
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 3.9737 - mean_absolute_error: 1.1318 - val_loss: 1.3614 - val_mean_absolute_error: 0.7843
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.9973 - mean_absolute_error: 1.0560 - val_loss: 1.0701 - val_mean_absolute_error: 0.7444
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.2139 - mean_absolute_error: 0.9026 - val_loss: 0.7097 - val_mean_absolute_error: 0.6791
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.0663 - mean_absolute_error: 0.7438 - val_loss: 0.4140 - val_mean_absolute_error: 0.5902
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 2.7608 - mean_absolute_error: 0.7654 - val_loss: 0.3320 - val_mean_absolute_error: 0.5326
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 2.7143 - mean_absolute_error: 1.1507 - val_loss: 0.3413 - val_mean_absolute_error: 0.5396
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.3587 - mean_absolute_error: 0.4885 - val_loss: 0.3466 - val_mean_absolute_error: 0.5439
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 38ms/step - loss: 17.8572 - mean_absolute_error: 1.8143 - val_loss: 0.3818 - val_mean_absolute_error: 0.5901
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 2.1195 - mean_absolute_error: 0.9546 - val_loss: 0.3935 - val_mean_absolute_error: 0.6043
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 1.8787 - mean_absolute_error: 0.8807 - val_loss: 0.3542 - val_mean_absolute_error: 0.5743
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 74ms/step - loss: 1.3920 - mean_absolute_error: 0.8051 - val_loss: 0.3813 - val_mean_absolute_error: 0.5821
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 43ms/step - loss: 1.4401 - mean_absolute_error: 0.8519 - val_loss: 0.3830 - val_mean_absolute_error: 0.5502
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.1497 - mean_absolute_error: 0.7268 - val_loss: 0.4181 - val_mean_absolute_error: 0.5210
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.9423 - mean_absolute_error: 0.8343 - val_loss: 0.5220 - val_mean_absolute_error: 0.5299
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 6.4891 - mean_absolute_error: 1.4082 - val_loss: 0.5194 - val_mean_absolute_error: 0.5033
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.2483 - mean_absolute_error: 0.8297 - val_loss: 0.4124 - val_mean_absolute_error: 0.4446
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 4.5298 - mean_absolute_error: 1.0868 - val_loss: 0.2845 - val_mean_absolute_error: 0.3738
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 6.7258 - mean_absolute_error: 1.2549 - val_loss: 0.1323 - val_mean_absolute_error: 0.2675
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 39ms/step - loss: 5.1833 - mean_absolute_error: 1.1443 - val_loss: 0.0552 - val_mean_absolute_error: 0.1582
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 40ms/step - loss: 0.5000 - mean_absolute_error: 0.5959 - val_loss: 0.0463 - val_mean_absolute_error: 0.0974
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.8510 - mean_absolute_error: 0.9714 - val_loss: 0.0574 - val_mean_absolute_error: 0.1618
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - loss: 1.4178 - mean_absolute_error: 0.7944 - val_loss: 0.0839 - val_mean_absolute_error: 0.2346
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 1.5391 - mean_absolute_error: 0.7568 - val_loss: 0.0978 - val_mean_absolute_error: 0.2633
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 4.4676 - mean_absolute_error: 1.0053 - val_loss: 0.1198 - val_mean_absolute_error: 0.3007
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 5.1447 - mean_absolute_error: 0.9243 - val_loss: 0.1749 - val_mean_absolute_error: 0.3698
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.8924 - mean_absolute_error: 0.6349 - val_loss: 0.2051 - val_mean_absolute_error: 0.4054
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 0.6055 - mean_absolute_error: 0.5744 - val_loss: 0.2259 - val_mean_absolute_error: 0.4294
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 0.3950 - mean_absolute_error: 0.5072 - val_loss: 0.2370 - val_mean_absolute_error: 0.4435
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 3.5463 - mean_absolute_error: 0.9509 - val_loss: 0.2169 - val_mean_absolute_error: 0.4268
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 41ms/step - loss: 2.2618 - mean_absolute_error: 0.8409 - val_loss: 0.1764 - val_mean_absolute_error: 0.3706
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 1.2536 - mean_absolute_error: 0.8287 - val_loss: 0.1706 - val_mean_absolute_error: 0.3549
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 45ms/step - loss: 2.7684 - mean_absolute_error: 0.9332 - val_loss: 0.2563 - val_mean_absolute_error: 0.4678
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 42ms/step - loss: 6.6343 - mean_absolute_error: 1.1043 - val_loss: 0.3892 - val_mean_absolute_error: 0.5590
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 46ms/step - loss: 2.5204 - mean_absolute_error: 0.8538 - val_loss: 0.4710 - val_mean_absolute_error: 0.6015
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 52ms/step - loss: 0.2742 - mean_absolute_error: 0.4082 - val_loss: 0.5494 - val_mean_absolute_error: 0.6391
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 51ms/step - loss: 0.8573 - mean_absolute_error: 0.6987 - val_loss: 0.6068 - val_mean_absolute_error: 0.6603
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.9015 - mean_absolute_error: 0.6649 - val_loss: 0.5694 - val_mean_absolute_error: 0.6346
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 75ms/step - loss: 1.2086 - mean_absolute_error: 0.6615 - val_loss: 0.4781 - val_mean_absolute_error: 0.5840
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - loss: 0.4620 - mean_absolute_error: 0.4660 - val_loss: 0.4041 - val_mean_absolute_error: 0.5367
Validation losses: [38.398555755615234, 14.612112045288086, 1.4093397855758667, 0.6390684843063354, 0.40413007140159607]
HPS: {'player_emb_dim': 32, 'dense_units': 16, 'dense_units_2': 128, 'learning_rate': 0.01, 'dropout_rate': 0.1, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.1, 'interaction_scale': 2} Avg. across folds score(MSE): 8.38918333053589
HPS: {'player_emb_dim': 32, 'dense_units': 64, 'dense_units_2': 16, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.4, 'interaction_scale': 4} Avg. across folds score(MSE): 10.185241138935089
HPS: {'player_emb_dim': 32, 'dense_units': 128, 'dense_units_2': 16, 'learning_rate': 0.0001, 'dropout_rate': 0.2, 'dropout_rate_2': 0.2, 'dropout_rate_inter': 0.1, 'interaction_scale': 4} Avg. across folds score(MSE): 8.80036201775074
HPS: {'player_emb_dim': 32, 'dense_units': 48, 'dense_units_2': 96, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.2, 'interaction_scale': 4} Avg. across folds score(MSE): 9.610527980327607
HPS: {'player_emb_dim': 32, 'dense_units': 80, 'dense_units_2': 48, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 2} Avg. across folds score(MSE): 11.092641228437424
HPS: {'player_emb_dim': 32, 'dense_units': 80, 'dense_units_2': 48, 'learning_rate': 0.01, 'dropout_rate': 0.4, 'dropout_rate_2': 0.30000000000000004, 'dropout_rate_inter': 0.30000000000000004, 'interaction_scale': 2}. Avg MSE: 11.092641228437424.
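The cross-fold score reported above is simply the arithmetic mean of the per-fold final validation losses. As a quick sanity check against the logged numbers (just a sketch; the actual aggregation happens inside our tuning loop):

```python
import numpy as np

# Final validation loss (MSE) of each of the 5 folds, as printed above.
fold_val_losses = [38.398555755615234, 14.612112045288086,
                   1.4093397855758667, 0.6390684843063354,
                   0.40413007140159607]

avg_mse = float(np.mean(fold_val_losses))
print(avg_mse)  # ≈ 11.0926, matching the reported "Avg MSE"
```

Note how a single bad fold (38.4) dominates the average; with this little data, one unlucky split can swing the whole score.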
Epoch 1/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 1s 1s/step - loss: 11.9406 - mean_absolute_error: 2.3698
Epoch 2/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 9.7077 - mean_absolute_error: 2.0718
Epoch 3/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 10.6276 - mean_absolute_error: 2.2072
Epoch 4/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 8.3413 - mean_absolute_error: 2.1662
Epoch 5/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 9.0744 - mean_absolute_error: 2.0866
Epoch 6/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 8.3995 - mean_absolute_error: 1.8694
Epoch 7/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 7.9003 - mean_absolute_error: 1.9455
Epoch 8/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 6.9372 - mean_absolute_error: 1.9037
Epoch 9/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 8.2478 - mean_absolute_error: 1.8541
Epoch 10/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 8.0675 - mean_absolute_error: 2.1305
Epoch 11/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 4.7517 - mean_absolute_error: 1.7300
Epoch 12/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 3.4960 - mean_absolute_error: 1.5295
Epoch 13/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 3.7890 - mean_absolute_error: 1.5116
Epoch 14/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.9784 - mean_absolute_error: 1.3586
Epoch 15/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.2312 - mean_absolute_error: 1.2432
Epoch 16/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.0622 - mean_absolute_error: 1.2109
Epoch 17/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 26ms/step - loss: 2.7377 - mean_absolute_error: 1.0611
Epoch 18/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.3021 - mean_absolute_error: 1.1763
Epoch 19/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 2.5120 - mean_absolute_error: 1.2915
Epoch 20/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 2.8933 - mean_absolute_error: 1.3541
Epoch 21/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 5.0702 - mean_absolute_error: 1.5527
Epoch 22/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.8920 - mean_absolute_error: 1.2002
Epoch 23/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.6001 - mean_absolute_error: 1.2054
Epoch 24/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 23ms/step - loss: 2.0189 - mean_absolute_error: 1.0086
Epoch 25/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.9486 - mean_absolute_error: 1.3327
Epoch 26/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 102ms/step - loss: 4.1954 - mean_absolute_error: 1.3611
Epoch 27/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 1.2447 - mean_absolute_error: 0.8315
Epoch 28/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 1.3350 - mean_absolute_error: 0.9197
Epoch 29/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.8410 - mean_absolute_error: 0.7624
Epoch 30/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.8354 - mean_absolute_error: 1.4566
Epoch 31/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 4.0363 - mean_absolute_error: 1.2248
Epoch 32/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 1.3878 - mean_absolute_error: 0.8435
Epoch 33/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 1.9936 - mean_absolute_error: 1.0518
Epoch 34/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 3.8689 - mean_absolute_error: 1.1044
Epoch 35/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 4.0422 - mean_absolute_error: 1.3377
Epoch 36/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 2.3729 - mean_absolute_error: 1.1632
Epoch 37/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - loss: 4.3870 - mean_absolute_error: 1.1235
Epoch 38/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.3527 - mean_absolute_error: 0.4645
Epoch 39/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 3.1023 - mean_absolute_error: 1.1642
Epoch 40/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.8429 - mean_absolute_error: 0.7730
Epoch 41/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 0.8913 - mean_absolute_error: 0.7585
Epoch 42/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 1.4760 - mean_absolute_error: 0.8091
Epoch 43/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step - loss: 1.0462 - mean_absolute_error: 0.6550
Epoch 44/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - loss: 3.6721 - mean_absolute_error: 0.8918
Epoch 45/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 1.2109 - mean_absolute_error: 0.8323
Epoch 46/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.9282 - mean_absolute_error: 0.7407
Epoch 47/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.7325 - mean_absolute_error: 0.6568
Epoch 48/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 0.7363 - mean_absolute_error: 0.6100
Epoch 49/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.0520 - mean_absolute_error: 0.9048
Epoch 50/50
1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - loss: 2.5421 - mean_absolute_error: 0.9848
marker_labels = [str(name) for name in team_members_with_ids.values()]
fig, _, _ = analyze_players_embeddings(model_inter_real, player_strengths_estimates, marker_labels)
fig
player_strengths.shape: (31,)
embeddings_nd[:, 0].shape : (31,)
Embeddings shape: (31, 16)
Dimension 1 correlation with base strengths: r = 0.1566, p-value = 0.4002
Dimension 2 correlation with base strengths: r = 0.0081, p-value = 0.9655
Dimension 3 correlation with base strengths: r = -0.1714, p-value = 0.3565
Dimension 4 correlation with base strengths: r = 0.0225, p-value = 0.9043
Dimension 5 correlation with base strengths: r = 0.1697, p-value = 0.3615
Dimension 6 correlation with base strengths: r = 0.0804, p-value = 0.6671
Dimension 7 correlation with base strengths: r = -0.0271, p-value = 0.8847
Dimension 8 correlation with base strengths: r = -0.0504, p-value = 0.7878
Dimension 9 correlation with base strengths: r = -0.0992, p-value = 0.5955
Dimension 10 correlation with base strengths: r = 0.0575, p-value = 0.7585
Dimension 11 correlation with base strengths: r = 0.0751, p-value = 0.6882
Dimension 12 correlation with base strengths: r = -0.1122, p-value = 0.5479
Dimension 13 correlation with base strengths: r = -0.0159, p-value = 0.9322
Dimension 14 correlation with base strengths: r = 0.1111, p-value = 0.5519
Dimension 15 correlation with base strengths: r = 0.1766, p-value = 0.3421
Dimension 16 correlation with base strengths: r = -0.0356, p-value = 0.8492
Average absolute correlation across 16 components: 0.0856
player_strengths.shape: (31,)
embeddings_nd[:, 0].shape : (31,)
Embeddings shape: (31, 3)
Dimension 1 correlation with base strengths: r = -0.0015, p-value = 0.9936
Dimension 2 correlation with base strengths: r = -0.0718, p-value = 0.7012
Dimension 3 correlation with base strengths: r = -0.1200, p-value = 0.5201
Average absolute correlation across 3 components: 0.0644
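The per-dimension numbers above are plain Pearson correlations between each embedding column and the base strength vector. A minimal sketch of that check (the names `embeddings_nd` and `player_strengths` follow the printout; the data here is synthetic, not our real estimates):

```python
import numpy as np

rng = np.random.default_rng(0)
player_strengths = rng.normal(size=31)    # stand-in for the real strength estimates
embeddings_nd = rng.normal(size=(31, 3))  # stand-in for the learned embeddings

# Pearson r between each embedding dimension and the base strengths.
corrs = [np.corrcoef(embeddings_nd[:, d], player_strengths)[0, 1]
         for d in range(embeddings_nd.shape[1])]
avg_abs = float(np.mean(np.abs(corrs)))
print(f"Average absolute correlation across {len(corrs)} components: {avg_abs:.4f}")
```

Low average absolute correlation, as in the output above, suggests no single embedding dimension is acting as a straightforward "strength" axis.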

TODO#

  • We need to change how masking is done: using an array of zeros as the masking value is incorrect.

In Keras, when you use the Masking layer to mark a padded value (e.g., zeros), masking works at the timestep (sequence) level, but a timestep is masked (ignored) only if all of its features equal the specified mask value.

What this means for an array of zeros as the padded value: if you specify mask_value=0.0 in keras.layers.Masking, a timestep is masked only if every feature in that timestep is exactly zero.

If a padded timestep is an array of zeros (e.g., [0.0, 0.0, …, 0.0] across all features), Keras will mask that timestep.

But if any feature in a timestep is non-zero, that timestep is not masked, even if the other features are zero.
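In other words, a timestep is masked only when every one of its features equals mask_value. A small NumPy sketch of that rule (this mirrors the mask that keras.layers.Masking(mask_value=0.0) would compute; it is not the layer itself):

```python
import numpy as np

mask_value = 0.0
# Batch of 1 sequence, 3 timesteps, 2 features.
x = np.array([[[0.0, 0.0],     # all features zero -> masked
               [0.0, 1.5],     # one non-zero      -> NOT masked
               [2.0, 3.0]]])   # fully valid       -> not masked

# A timestep is kept (mask=True) unless *all* of its features equal mask_value.
mask = np.any(x != mask_value, axis=-1)
print(mask)  # [[False  True  True]]
```

This is exactly the pitfall flagged in the TODO: a legitimately zero feature mixed with non-zero ones leaves the timestep unmasked, so only an all-zero padding vector is reliably ignored.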