So I set log level 3 in smb.conf and watched the log during the backup. Veeam got through about 60 GB before it aborted. The following was logged at the time of the abort:
Code:
[2018/07/07 11:20:42.326343, 3] ../source3/smbd/smb2_write.c:212(smb2_write_complete_internal)
smb2: fnum 2843473916, file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk, length=65536 offset=0 wrote=65536
[2018/07/07 11:20:42.326901, 3] ../source3/smbd/smb2_write.c:212(smb2_write_complete_internal)
smb2: fnum 2843473916, file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk, length=65536 offset=0 wrote=65536
[2018/07/07 11:20:42.327499, 3] ../source3/smbd/smb2_write.c:212(smb2_write_complete_internal)
smb2: fnum 2843473916, file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk, length=65536 offset=0 wrote=65536
[2018/07/07 11:21:45.263543, 3] ../source3/smbd/smb2_write.c:212(smb2_write_complete_internal)
smb2: fnum 2843473916, file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk, length=65536 offset=0 wrote=65536
[2018/07/07 11:21:45.263726, 2] ../source3/smbd/close.c:789(close_normal_file)
veeam closed file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk (numopen=0) NT_STATUS_OK
[2018/07/07 11:21:45.263778, 2] ../source3/smbd/service.c:1120(close_cnum)
srvdc1 (ipv4:172.17.37.3:56181) closed connection to service Backup
[2018/07/07 11:21:45.267099, 3] ../source3/smbd/server_exit.c:248(exit_server_common)
Server exit (NT_STATUS_CONNECTION_DISCONNECTED)
[2018/07/07 11:21:54.453688, 3] ../source3/smbd/oplock.c:1329(init_oplocks)
init_oplocks: initializing messages.
[2018/07/07 11:21:54.453737, 3] ../source3/smbd/process.c:1959(process_smb)
Transaction 0 of length 108 (0 toread)
[2018/07/07 11:21:54.454012, 3] ../source3/smbd/smb2_negprot.c:290(smbd_smb2_request_process_negprot)
Selected protocol SMB2_10
[2018/07/07 11:21:54.454612, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'gssapi_spnego' registered
[2018/07/07 11:21:54.454622, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'gssapi_krb5' registered
[2018/07/07 11:21:54.454629, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'gssapi_krb5_sasl' registered
[2018/07/07 11:21:54.454635, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'spnego' registered
[2018/07/07 11:21:54.454642, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'schannel' registered
[2018/07/07 11:21:54.454648, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'naclrpc_as_system' registered
[2018/07/07 11:21:54.454655, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'sasl-EXTERNAL' registered
[2018/07/07 11:21:54.454665, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'ntlmssp' registered
[2018/07/07 11:21:54.454672, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'ntlmssp_resume_ccache' registered
[2018/07/07 11:21:54.454680, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'http_basic' registered
[2018/07/07 11:21:54.454686, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'http_ntlm' registered
[2018/07/07 11:21:54.454693, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'krb5' registered
[2018/07/07 11:21:54.454701, 3] ../auth/gensec/gensec_start.c:977(gensec_register)
GENSEC backend 'fake_gssapi_krb5' registered
[2018/07/07 11:21:54.455791, 3] ../auth/ntlmssp/ntlmssp_util.c:69(debug_ntlmssp_flags)
Got NTLMSSP neg_flags=0xe2088297
[2018/07/07 11:21:54.456478, 3] ../auth/ntlmssp/ntlmssp_server.c:454(ntlmssp_server_preauth)
Got user=[veeam] domain=[FWTT] workstation=[SRVDC1] len1=24 len2=324
[2018/07/07 11:21:54.456513, 3] ../source3/param/loadparm.c:3856(lp_load_ex)
lp_load_ex: refreshing parameters
[2018/07/07 11:21:54.456564, 3] ../source3/param/loadparm.c:543(init_globals)
Initialising global parameters
[2018/07/07 11:21:54.456651, 3] ../source3/param/loadparm.c:2770(lp_do_section)
Processing section "[global]"
[2018/07/07 11:21:54.457237, 2] ../source3/param/loadparm.c:2787(lp_do_section)
Processing section "[Backup]"
[2018/07/07 11:21:54.457464, 3] ../source3/param/loadparm.c:1598(lp_add_ipc)
adding IPC service
[2018/07/07 11:21:54.457480, 3] ../source3/auth/auth.c:189(auth_check_ntlm_password)
check_ntlm_password: Checking password for unmapped user [FWTT]\[veeam]@[SRVDC1] with the new password interface
[2018/07/07 11:21:54.457487, 3] ../source3/auth/auth.c:192(auth_check_ntlm_password)
check_ntlm_password: mapped user is: [FWTT]\[veeam]@[SRVDC1]
[2018/07/07 11:21:54.458043, 3] ../source3/passdb/lookup_sid.c:1680(get_primary_group_sid)
Forcing Primary Group to 'Domain Users' for veeam
[2018/07/07 11:21:54.458267, 3] ../source3/auth/auth.c:256(auth_check_ntlm_password)
auth_check_ntlm_password: sam_ignoredomain authentication for user [veeam] succeeded
[2018/07/07 11:21:54.460860, 3] ../auth/auth_log.c:760(log_authentication_event_human_readable)
Auth: [SMB2,(null)] user [FWTT]\[veeam] at [Sat, 07 Jul 2018 11:21:54.460847 CEST] with [NTLMv2] status [NT_STATUS_OK] workstation [SRVDC1] remote host [ipv4:172.17.37.3:63052] became [FWWKBACKUPZFS]\[veeam] [S-1-5-21-3982563097-2396922467-1248689714-3000]. local host [ipv4:172.17.37.233:445]
[2018/07/07 11:21:54.460879, 3] ../auth/auth_log.c:591(log_no_json)
log_no_json: JSON auth logs not available unless compiled with jansson
[2018/07/07 11:21:54.460885, 2] ../source3/auth/auth.c:314(auth_check_ntlm_password)
check_ntlm_password: authentication for user [veeam] -> [veeam] -> [veeam] succeeded
[2018/07/07 11:21:54.461036, 3] ../source3/auth/token_util.c:559(finalize_local_nt_token)
Failed to fetch domain sid for FWTT
[2018/07/07 11:21:54.461061, 3] ../source3/auth/token_util.c:591(finalize_local_nt_token)
Failed to fetch domain sid for FWTT
[2018/07/07 11:21:54.479762, 3] ../auth/ntlmssp/ntlmssp_sign.c:509(ntlmssp_sign_reset)
NTLMSSP Sign/Seal - Initialising with flags:
[2018/07/07 11:21:54.479772, 3] ../auth/ntlmssp/ntlmssp_util.c:69(debug_ntlmssp_flags)
Got NTLMSSP neg_flags=0xe2088215
[2018/07/07 11:21:54.479801, 3] ../auth/ntlmssp/ntlmssp_sign.c:509(ntlmssp_sign_reset)
NTLMSSP Sign/Seal - Initialising with flags:
[2018/07/07 11:21:54.479806, 3] ../auth/ntlmssp/ntlmssp_util.c:69(debug_ntlmssp_flags)
Got NTLMSSP neg_flags=0xe2088215
[2018/07/07 11:21:54.479990, 3] ../source3/auth/token_util.c:559(finalize_local_nt_token)
Failed to fetch domain sid for FWTT
[2018/07/07 11:21:54.480014, 3] ../source3/auth/token_util.c:591(finalize_local_nt_token)
Failed to fetch domain sid for FWTT
[2018/07/07 11:21:54.480117, 3] ../source3/smbd/password.c:144(register_homes_share)
Adding homes service for user 'veeam' using home directory: '/nonexistent'
[2018/07/07 11:21:54.483103, 3] ../lib/util/access.c:361(allow_access)
Allowed connection from srvdc1.fwtt.local (172.17.37.3)
[2018/07/07 11:21:54.483154, 3] ../source3/smbd/service.c:595(make_connection_snum)
Connect path is '/tmp' for service [IPC$]
[2018/07/07 11:21:54.483175, 3] ../source3/smbd/vfs.c:113(vfs_init_default)
Initialising default vfs hooks
[2018/07/07 11:21:54.483186, 3] ../source3/smbd/vfs.c:139(vfs_init_custom)
Initialising custom vfs hooks from [/[Default VFS]/]
[2018/07/07 11:21:54.483290, 3] ../source3/smbd/service.c:841(make_connection_snum)
srvdc1 (ipv4:172.17.37.3:63052) connect to service IPC$ initially as user veeam (uid=1000, gid=1000) (pid 90876)
[2018/07/07 11:21:54.483936, 3] ../source3/smbd/msdfs.c:1008(get_referred_path)
get_referred_path: |Backup| in dfs path \172.17.37.233\Backup is not a dfs root.
[2018/07/07 11:21:54.483949, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_NOT_FOUND] || at ../source3/smbd/smb2_ioctl.c:309
[2018/07/07 11:21:54.484568, 3] ../lib/util/access.c:361(allow_access)
Allowed connection from srvdc1.fwtt.local (172.17.37.3)
[2018/07/07 11:21:54.484605, 3] ../source3/smbd/service.c:595(make_connection_snum)
Connect path is '/mnt/PoolFWWK/BackupFWWK' for service [Backup]
[2018/07/07 11:21:54.484625, 3] ../source3/smbd/vfs.c:113(vfs_init_default)
Initialising default vfs hooks
[2018/07/07 11:21:54.484631, 3] ../source3/smbd/vfs.c:139(vfs_init_custom)
Initialising custom vfs hooks from [/[Default VFS]/]
[2018/07/07 11:21:54.484640, 3] ../source3/smbd/vfs.c:139(vfs_init_custom)
Initialising custom vfs hooks from [streams_xattr]
[2018/07/07 11:21:54.484857, 3] ../lib/util/modules.c:167(load_module_absolute_path)
load_module_absolute_path: Module '/usr/local/lib/shared-modules/vfs/streams_xattr.so' loaded
[2018/07/07 11:21:54.484871, 3] ../source3/smbd/vfs.c:139(vfs_init_custom)
Initialising custom vfs hooks from [zfsacl]
[2018/07/07 11:21:54.485157, 3] ../lib/util/modules.c:167(load_module_absolute_path)
load_module_absolute_path: Module '/usr/local/lib/shared-modules/vfs/zfsacl.so' loaded
[2018/07/07 11:21:54.485170, 3] ../source3/smbd/vfs.c:139(vfs_init_custom)
Initialising custom vfs hooks from [zfs_space]
[2018/07/07 11:21:54.493624, 3] ../lib/util/modules.c:167(load_module_absolute_path)
load_module_absolute_path: Module '/usr/local/lib/shared-modules/vfs/zfs_space.so' loaded
[2018/07/07 11:21:54.493762, 2] ../source3/smbd/service.c:841(make_connection_snum)
srvdc1 (ipv4:172.17.37.3:63052) connect to service Backup initially as user veeam (uid=1000, gid=1000) (pid 90876)
[2018/07/07 11:21:54.495179, 3] ../source3/smbd/dir.c:657(dptr_create)
creating new dirptr 0 for path Veeam, expect_close = 0
[2018/07/07 11:21:54.495233, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/. fname=. (.)
[2018/07/07 11:21:54.495324, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/.. fname=.. (..)
[2018/07/07 11:21:54.495385, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/VeeamRecoveryMedia_SRVFS2.iso fname=VeeamRecoveryMedia_SRVFS2.iso (VeeamRecoveryMedia_SRVFS2.iso)
[2018/07/07 11:21:54.495452, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/Aktive VMs Linux fname=Aktive VMs Linux (Aktive VMs Linux)
[2018/07/07 11:21:54.495517, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/Aktive VMs fname=Aktive VMs (Aktive VMs)
[2018/07/07 11:21:54.495578, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/Inaktive VMs fname=Inaktive VMs (Inaktive VMs)
[2018/07/07 11:21:54.495667, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent fname=FWTT_veeamagent (FWTT_veeamagent)
[2018/07/07 11:21:54.495730, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/VeeamRecoveryMedia_SRVDC1.iso fname=VeeamRecoveryMedia_SRVDC1.iso (VeeamRecoveryMedia_SRVDC1.iso)
[2018/07/07 11:21:54.495782, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[5] status[STATUS_NO_MORE_FILES] || at ../source3/smbd/smb2_query_directory.c:155
[2018/07/07 11:21:54.497428, 3] ../source3/smbd/trans2.c:3456(smbd_do_qfsinfo)
smbd_do_qfsinfo: level = 1001
[2018/07/07 11:21:54.497455, 3] ../source3/smbd/trans2.c:3456(smbd_do_qfsinfo)
smbd_do_qfsinfo: level = 1005
[2018/07/07 11:21:54.497595, 3] ../source3/smbd/trans2.c:3456(smbd_do_qfsinfo)
smbd_do_qfsinfo: level = 1007
[2018/07/07 11:21:54.498013, 3] ../source3/lib/sysquotas.c:488(sys_get_quota)
sys_get_vfs_quota() failed for mntpath[Veeam/FWTT_veeamagent] bdev[(null)] qtype[1] id[-1]: Operation not supported
[2018/07/07 11:21:54.498026, 3] ../source3/lib/sysquotas.c:488(sys_get_quota)
sys_get_vfs_quota() failed for mntpath[Veeam/FWTT_veeamagent] bdev[(null)] qtype[3] id[-1]: Operation not supported
[2018/07/07 11:21:54.499410, 3] ../source3/smbd/dir.c:657(dptr_create)
creating new dirptr 0 for path Veeam/FWTT_veeamagent, expect_close = 0
[2018/07/07 11:21:54.499456, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/. fname=. (.)
[2018/07/07 11:21:54.499507, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/.. fname=.. (..)
[2018/07/07 11:21:54.499578, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2 fname=Backup_Job_SRVFS2 (Backup_Job_SRVFS2)
[2018/07/07 11:21:54.529565, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVDC1 fname=Backup_Job_SRVDC1 (Backup_Job_SRVDC1)
[2018/07/07 11:21:54.529648, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[9] status[STATUS_NO_MORE_FILES] || at ../source3/smbd/smb2_query_directory.c:155
[2018/07/07 11:21:54.531117, 3] ../source3/smbd/trans2.c:3456(smbd_do_qfsinfo)
smbd_do_qfsinfo: level = 1007
[2018/07/07 11:21:54.531472, 3] ../source3/lib/sysquotas.c:488(sys_get_quota)
sys_get_vfs_quota() failed for mntpath[Veeam/FWTT_veeamagent] bdev[(null)] qtype[1] id[-1]: Operation not supported
[2018/07/07 11:21:54.531485, 3] ../source3/lib/sysquotas.c:488(sys_get_quota)
sys_get_vfs_quota() failed for mntpath[Veeam/FWTT_veeamagent] bdev[(null)] qtype[3] id[-1]: Operation not supported
[2018/07/07 11:21:57.975048, 2] ../source3/smbd/open.c:1404(open_file)
veeam opened file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm_106_tmp read=No write=Yes (numopen=1)
[2018/07/07 11:21:57.976698, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[STATUS_NOTIFY_ENUM_DIR] || at ../source3/smbd/smb2_notify.c:123
[2018/07/07 11:21:57.978261, 3] ../source3/smbd/smb2_notify.c:250(smbd_smb2_notify_send)
smbd_smb2_notify_send: notify change called on Veeam/FWTT_veeamagent/Backup_Job_SRVFS2, filter = FILE_NAME|DIR_NAME, recursive = 1
[2018/07/07 11:21:57.978277, 3] ../source3/smbd/smb2_write.c:212(smb2_write_complete_internal)
smb2: fnum 4107690722, file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm_106_tmp, length=26510 offset=0 wrote=26510
[2018/07/07 11:21:57.978670, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[STATUS_NOTIFY_ENUM_DIR] || at ../source3/smbd/smb2_notify.c:123
[2018/07/07 11:21:57.978693, 2] ../source3/smbd/close.c:789(close_normal_file)
veeam closed file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm_106_tmp (numopen=0) NT_STATUS_OK
[2018/07/07 11:21:57.980892, 3] ../source3/smbd/dir.c:657(dptr_create)
creating new dirptr 0 for path Veeam/FWTT_veeamagent/Backup_Job_SRVFS2, expect_close = 0
[2018/07/07 11:21:57.980947, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/. fname=. (.)
[2018/07/07 11:21:57.981024, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/.. fname=.. (..)
[2018/07/07 11:21:57.981108, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk fname=Backup Job SRVFS22018-07-07T100908.vbk (Backup Job SRVFS22018-07-07T100908.vbk)
[2018/07/07 11:21:57.981186, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm fname=Backup Job SRVFS2.vbm (Backup Job SRVFS2.vbm)
[2018/07/07 11:21:57.981256, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm_106_tmp fname=Backup Job SRVFS2.vbm_106_tmp (Backup Job SRVFS2.vbm_106_tmp)
[2018/07/07 11:21:57.981314, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[5] status[STATUS_NO_MORE_FILES] || at ../source3/smbd/smb2_query_directory.c:155
[2018/07/07 11:21:57.983092, 2] ../source3/smbd/open.c:1404(open_file)
veeam opened file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm read=No write=No (numopen=1)
[2018/07/07 11:21:57.983481, 3] ../source3/smbd/trans2.c:8430(smbd_do_setfilepathinfo)
smbd_do_setfilepathinfo: Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm (fnum 3777892918) info_level=1013 totdata=1
[2018/07/07 11:21:57.983966, 2] ../source3/smbd/close.c:789(close_normal_file)
veeam closed file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm (numopen=0) NT_STATUS_OK
[2018/07/07 11:21:57.985106, 2] ../source3/smbd/open.c:1404(open_file)
veeam opened file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm_106_tmp read=No write=No (numopen=1)
[2018/07/07 11:21:57.985550, 3] ../source3/smbd/trans2.c:8430(smbd_do_setfilepathinfo)
smbd_do_setfilepathinfo: Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm_106_tmp (fnum 3265471589) info_level=65290 totdata=146
[2018/07/07 11:21:57.985799, 3] ../source3/smbd/reply.c:6866(rename_internals_fsp)
rename_internals_fsp: succeeded doing rename on Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm_106_tmp -> Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm
[2018/07/07 11:21:57.986208, 2] ../source3/smbd/close.c:789(close_normal_file)
veeam closed file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm (numopen=0) NT_STATUS_OK
[2018/07/07 11:21:57.987821, 3] ../source3/smbd/dir.c:657(dptr_create)
creating new dirptr 0 for path Veeam/FWTT_veeamagent/Backup_Job_SRVFS2, expect_close = 0
[2018/07/07 11:21:57.987876, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/. fname=. (.)
[2018/07/07 11:21:57.987956, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/.. fname=.. (..)
[2018/07/07 11:21:57.988027, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk fname=Backup Job SRVFS22018-07-07T100908.vbk (Backup Job SRVFS22018-07-07T100908.vbk)
[2018/07/07 11:21:57.988096, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm fname=Backup Job SRVFS2.vbm (Backup Job SRVFS2.vbm)
[2018/07/07 11:21:57.988145, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[9] status[STATUS_NO_MORE_FILES] || at ../source3/smbd/smb2_query_directory.c:155
[2018/07/07 11:21:58.986362, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_CANCELLED] || at ../source3/smbd/smb2_notify.c:123
[2018/07/07 11:21:58.986576, 3] ../lib/util/access.c:361(allow_access)
Allowed connection from srvfs2.fwtt.local (172.17.37.21)
[2018/07/07 11:21:58.986657, 3] ../source3/smbd/service.c:595(make_connection_snum)
Connect path is '/tmp' for service [IPC$]
[2018/07/07 11:21:58.986687, 3] ../source3/smbd/vfs.c:113(vfs_init_default)
Initialising default vfs hooks
[2018/07/07 11:21:58.986700, 3] ../source3/smbd/vfs.c:139(vfs_init_custom)
Initialising custom vfs hooks from [/[Default VFS]/]
[2018/07/07 11:21:58.986820, 3] ../source3/smbd/service.c:841(make_connection_snum)
srvfs2 (ipv4:172.17.37.21:65215) connect to service IPC$ initially as user veeam (uid=1000, gid=1000) (pid 81175)
[2018/07/07 11:21:58.988583, 3] ../source3/smbd/msdfs.c:1008(get_referred_path)
get_referred_path: |Backup| in dfs path \172.17.37.233\Backup is not a dfs root.
[2018/07/07 11:21:58.988602, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_NOT_FOUND] || at ../source3/smbd/smb2_ioctl.c:309
[2018/07/07 11:21:58.990773, 3] ../source3/smbd/smb2_notify.c:250(smbd_smb2_notify_send)
smbd_smb2_notify_send: notify change called on Veeam/FWTT_veeamagent, filter = FILE_NAME|DIR_NAME|ATTRIBUTES|LAST_WRITE, recursive = 0
[2018/07/07 11:21:58.990863, 3] ../source3/smbd/dir.c:657(dptr_create)
creating new dirptr 0 for path Veeam/FWTT_veeamagent/Backup_Job_SRVFS2, expect_close = 0
[2018/07/07 11:21:58.990924, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/. fname=. (.)
[2018/07/07 11:21:58.991006, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/.. fname=.. (..)
[2018/07/07 11:21:58.991086, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk fname=Backup Job SRVFS22018-07-07T100908.vbk (Backup Job SRVFS22018-07-07T100908.vbk)
[2018/07/07 11:21:58.991175, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm fname=Backup Job SRVFS2.vbm (Backup Job SRVFS2.vbm)
[2018/07/07 11:21:58.991231, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[5] status[STATUS_NO_MORE_FILES] || at ../source3/smbd/smb2_query_directory.c:155
[2018/07/07 11:21:59.008956, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_FS_DRIVER_REQUIRED] || at ../source3/smbd/smb2_ioctl.c:309
[2018/07/07 11:21:59.978347, 3] ../source3/smbd/smb2_notify.c:250(smbd_smb2_notify_send)
smbd_smb2_notify_send: notify change called on Veeam/FWTT_veeamagent/Backup_Job_SRVFS2, filter = FILE_NAME|DIR_NAME, recursive = 1
[2018/07/07 11:21:59.978380, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[STATUS_NOTIFY_ENUM_DIR] || at ../source3/smbd/smb2_notify.c:123
[2018/07/07 11:22:00.983215, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_CANCELLED] || at ../source3/smbd/smb2_notify.c:123
[2018/07/07 11:22:00.985391, 3] ../source3/smbd/smb2_notify.c:250(smbd_smb2_notify_send)
smbd_smb2_notify_send: notify change called on Veeam/FWTT_veeamagent, filter = FILE_NAME|DIR_NAME|ATTRIBUTES|LAST_WRITE, recursive = 0
[2018/07/07 11:22:00.986176, 3] ../source3/smbd/dir.c:657(dptr_create)
creating new dirptr 0 for path Veeam/FWTT_veeamagent/Backup_Job_SRVFS2, expect_close = 0
[2018/07/07 11:22:00.986221, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/. fname=. (.)
[2018/07/07 11:22:00.986309, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/.. fname=.. (..)
[2018/07/07 11:22:00.986382, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk fname=Backup Job SRVFS22018-07-07T100908.vbk (Backup Job SRVFS22018-07-07T100908.vbk)
[2018/07/07 11:22:00.986468, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS2.vbm fname=Backup Job SRVFS2.vbm (Backup Job SRVFS2.vbm)
[2018/07/07 11:22:00.986515, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[5] status[STATUS_NO_MORE_FILES] || at ../source3/smbd/smb2_query_directory.c:155
[2018/07/07 11:22:00.994537, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_FS_DRIVER_REQUIRED] || at ../source3/smbd/smb2_ioctl.c:309
[2018/07/07 11:22:01.978613, 3] ../source3/smbd/smb2_notify.c:250(smbd_smb2_notify_send)
smbd_smb2_notify_send: notify change called on Veeam/FWTT_veeamagent/Backup_Job_SRVFS2, filter = FILE_NAME|DIR_NAME, recursive = 1
[2018/07/07 11:22:02.984535, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_CANCELLED] || at ../source3/smbd/smb2_notify.c:123
[2018/07/07 11:22:02.987647, 3] ../source3/smbd/smb2_notify.c:250(smbd_smb2_notify_send)
smbd_smb2_notify_send: notify change called on Veeam/FWTT_veeamagent, filter = FILE_NAME|DIR_NAME|ATTRIBUTES|LAST_WRITE, recursive = 0
[2018/07/07 11:22:03.005042, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_FS_DRIVER_REQUIRED] || at ../source3/smbd/smb2_ioctl.c:309
[2018/07/07 11:22:07.308240, 3] ../source3/smbd/service.c:1120(close_cnum)
srvdc1 (ipv4:172.17.37.3:63052) closed connection to service IPC$
[2018/07/07 11:22:10.070053, 2] ../source3/smbd/server.c:803(remove_child_pid)
Could not find child 91113 -- ignoring
[2018/07/07 11:22:13.314244, 2] ../source3/smbd/service.c:1120(close_cnum)
srvdc1 (ipv4:172.17.37.3:63052) closed connection to service Backup
[2018/07/07 11:22:13.316491, 3] ../source3/smbd/server_exit.c:248(exit_server_common)
Server exit (NT_STATUS_CONNECTION_RESET)
[2018/07/07 11:22:15.455963, 3] ../source3/smbd/service.c:1120(close_cnum)
srvfs2 (ipv4:172.17.37.21:65215) closed connection to service IPC$
[2018/07/07 10:36:22.759700, 3] ../source3/smbd/smb2_write.c:212(smb2_write_complete_internal)
smb2: fnum 2843473916, file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk, length=65536 offset=0 wrote=65536
[2018/07/07 10:36:22.760089, 3] ../source3/smbd/smb2_write.c:212(smb2_write_complete_internal)
smb2: fnum 2843473916, file Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk, length=61440 offset=0 wrote=61440
[2018/07/07 10:37:12.169781, 3] ../lib/util/access.c:361(allow_access)
Allowed connection from srvfs2.fwtt.local (172.17.37.21)
[2018/07/07 10:37:12.169852, 3] ../source3/smbd/service.c:595(make_connection_snum)
Connect path is '/tmp' for service [IPC$]
[2018/07/07 10:37:12.169878, 3] ../source3/smbd/vfs.c:113(vfs_init_default)
Initialising default vfs hooks
[2018/07/07 10:37:12.169883, 3] ../source3/smbd/vfs.c:139(vfs_init_custom)
Initialising custom vfs hooks from [/[Default VFS]/]
[2018/07/07 10:37:12.169990, 3] ../source3/smbd/service.c:841(make_connection_snum)
srvfs2 (ipv4:172.17.37.21:65215) connect to service IPC$ initially as user veeam (uid=1000, gid=1000) (pid 81175)
[2018/07/07 10:37:12.170341, 3] ../source3/smbd/msdfs.c:1008(get_referred_path)
get_referred_path: |Backup| in dfs path \172.17.37.233\Backup is not a dfs root.
[2018/07/07 10:37:12.170355, 3] ../source3/smbd/smb2_server.c:3115(smbd_smb2_request_error_ex)
smbd_smb2_request_error_ex: smbd_smb2_request_error_ex: idx[1] status[NT_STATUS_NOT_FOUND] || at ../source3/smbd/smb2_ioctl.c:309
[2018/07/07 10:37:12.185749, 3] ../source3/smbd/dir.c:657(dptr_create)
creating new dirptr 0 for path Veeam/FWTT_veeamagent/Backup_Job_SRVFS2, expect_close = 0
[2018/07/07 10:37:12.185827, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/. fname=. (.)
[2018/07/07 10:37:12.205951, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/.. fname=.. (..)
[2018/07/07 10:37:12.232964, 3] ../source3/smbd/dir.c:1228(smbd_dirptr_get_entry)
smbd_dirptr_get_entry mask=[*] found Veeam/FWTT_veeamagent/Backup_Job_SRVFS2/Backup Job SRVFS22018-07-07T100908.vbk fname=Backup Job SRVFS22018-07-07T100908.vbk (Backup Job SRVFS22018-07-07T100908.vbk)
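For reference, the smb.conf change that produced this verbosity was roughly the following (the log file path is from my setup and may differ on yours):

```ini
[global]
    ; raise verbosity from the default (1) to see per-request detail
    log level = 3
    ; one log file per client machine makes it easier to follow a single session
    log file = /var/log/samba4/log.%m
```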
I can't spot a direct error in there that would explain the behaviour. The other error messages are of a different nature, I believe, aren't they? At least they only ever appear when the Windows server reconnects to the NAS.
This is the part I don't understand. Why would Veeam close the file?
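To separate the interesting lifecycle events (file close, connection close, server exit) from the write noise, I filter the log like this. The example runs on two inline sample lines so it is self-contained; in practice you would replace the printf with a cat of your smbd log (the path depends on the "log file" setting in smb.conf):

```shell
# Filter the level-2/3 lifecycle events out of the verbose smbd log.
# In practice: grep -E 'closed file|closed connection|Server exit' /var/log/samba4/log.smbd
# (that path is an assumption -- check your smb.conf "log file" setting).
printf '%s\n' \
  'veeam closed file Backup.vbk (numopen=0) NT_STATUS_OK' \
  'smb2: fnum 1, file Backup.vbk, length=65536 offset=0 wrote=65536' \
  'Server exit (NT_STATUS_CONNECTION_DISCONNECTED)' \
| grep -E 'closed file|closed connection|Server exit'
```

That way only the close/exit lines remain, which makes the sequence around the abort (last write, file close, connection close, server exit) easy to see.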