Server : Apache
System : Linux iad1-shared-b8-43 6.6.49-grsec-jammy+ #10 SMP Thu Sep 12 23:23:08 UTC 2024 x86_64
User : dh_edsupp (6597262)
PHP Version : 8.2.26
Disable Function : NONE
Directory : /lib/python3/dist-packages/hgext/remotefilelog/__pycache__/

Current File : /lib/python3/dist-packages/hgext/remotefilelog/__pycache__/repack.cpython-310.pyc
[Binary data omitted: the file body is compiled CPython 3.10 bytecode for the Mercurial extension module hgext.remotefilelog.repack and is not human-readable source. The only plain-text content recoverable from the dump is the module's embedded docstrings, summarized below.]

Recoverable docstrings from repack.cpython-310.pyc:

fullrepack: if ``packsonly`` is True, stores creating only loose objects are skipped.
incrementalrepack: repacks the repo by looking at the distribution of pack files and performing the most minimal repack needed to keep the repo in good shape.
_deletebigpacks: deletes packfiles that are bigger than ``packs.maxpacksize``; returns ``files`` with the removed files omitted.
_computeincrementalpack: given a set of pack files and the configuration options, computes the list of files that should be packed as part of an incremental repack. It tries to strike a balance between keeping incremental repacks cheap (packing small things when possible) and rolling the packs up into the big ones over time.
isold (local helper in _runrepack): checks whether a file node is older than a limit; unless a limit is specified in the config, the default limit is used.
keepset: computes a keepset which is not garbage collected. 'keyfn' maps (filename, node) to a unique key; 'lastkeepkeys' is optional and, if provided, is updated with more keys and returned.
repacker: class for orchestrating the repack of data and history information into a new format.
repacker._chainorphans: reorders orphan entries (nodes without a base that are not referenced by other nodes) into a single chain sorted by size instead of storing them as individual fulltexts; such orphans usually come from gaps in history.
repackledger: storage for all the bookkeeping that happens during a repack; it records the list of revisions being repacked, what happened to each revision, and which source store originally contained which revision (for later cleanup).
repackentry: simple class representing a single revision entry in the repackledger.
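The generation scheme referred to in the _computeincrementalpack docstring can be made concrete with a small sketch. The Python below is purely illustrative: the function name, thresholds, and input format are assumptions made for this example, not the code compiled into the bytecode above. Packs are bucketed by size into generations, and the smallest generation that has grown past a count limit is the one chosen for merging, so small packs get rolled up often and large packs are rarely touched.

# Illustrative sketch only: NOT the Mercurial implementation, just a model of the
# generation-bucketing idea described in the _computeincrementalpack docstring.
def choose_packs_to_repack(pack_sizes, generations=(1 << 30, 1 << 20, 0), gencountlimit=2):
    """pack_sizes: mapping of pack name -> size in bytes (hypothetical input).

    Returns the names of packs in the smallest size generation that has more
    than `gencountlimit` members, i.e. the cheapest group worth merging.
    """
    # Bucket packs by generation: generations[i] is the minimum size for bucket i,
    # ordered from largest to smallest, so the last bucket holds the smallest packs.
    buckets = [[] for _ in generations]
    for name, size in pack_sizes.items():
        for i, threshold in enumerate(generations):
            if size >= threshold:
                buckets[i].append(name)
                break

    # Walk from the smallest generation upward and repack the first crowded one.
    for bucket in reversed(buckets):
        if len(bucket) > gencountlimit:
            return sorted(bucket)
    return []  # nothing is crowded enough to be worth repacking

if __name__ == "__main__":
    sizes = {"a.pack": 10_000, "b.pack": 12_000, "c.pack": 9_000, "d.pack": 5 << 30}
    # The three small packs get merged; the 5 GB pack is left alone.
    print(choose_packs_to_repack(sizes))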